I'm trying to integrate R into Scala using JVMR. I am getting a NoSuchMethodError when attempting to instantiate RInScala.
I'm working on a Windows 7 machine with R installed under C:\Program Files\R\R-3.1.1 and Scala version 2.11.1 is installed under C:\Program Files (x86)\scala. I'm developing in IntelliJ with the Scala plugin and am using the Scala worksheet just to test this out as a POC. My Scala project does show JVMR 2.11-2.11.1.1.jar as an included library. The worksheet is very basic at present - just the import and the instantiation attempt.
import org.ddahl.jvmr.RInScala
val R = RInScala()
When running the worksheet in IntelliJ, I see the following output, so I can tell that it's successfully importing the class but failing to instantiate it.
import org.ddahl.jvmr.RInScala
java.lang.NoSuchMethodError: scala.runtime.ObjectRef.create(Ljava/lang/Object;)Lscala/runtime/ObjectRef;
at org.ddahl.jvmr.RInScala$.findROnWindows(RInScalaTest.sc2318647708135405919.tmp:804)
at org.ddahl.jvmr.RInScala$.defaultExecutable$lzycompute(RInScalaTest.sc2318647708135405919.tmp:822)
at org.ddahl.jvmr.RInScala$.defaultExecutable(RInScalaTest.sc2318647708135405919.tmp:821)
at org.ddahl.jvmr.RInScala.<init>(RInScalaTest.sc2318647708135405919.tmp:28)
at org.ddahl.jvmr.RInScala$.apply(RInScalaTest.sc2318647708135405919.tmp:838)
at com.xxxx.r_in_scala.A$A1$A$A1.R$lzycompute(RInScalaTest.sc2318647708135405919.tmp:2)
at com.xxxx.r_in_scala.A$A1$A$A1.R(RInScalaTest.sc2318647708135405919.tmp:2)
at com.xxxx.r_in_scala.A$A1$A$A1.get$$instance$$R(RInScalaTest.sc2318647708135405919.tmp:2)
at #worksheet#.#worksheet#(RInScalaTest.sc2318647708135405919.tmp:10)
I've drilled into the code for findROnWindows, and my installation should be found based on the values of the registry keys being read. I'm sure I'm missing something simple, but I'm at that "I've been looking at the problem for too long without figuring it out and just need a new set of eyes" stage.
To my knowledge this is currently some kind of bug in IntelliJ's worksheet implementation, or else a misconfiguration. The NoSuchMethodError is telling: scala.runtime.ObjectRef.create only exists from Scala 2.11 onward, so somewhere the worksheet is running your code against an older Scala runtime than the one the 2.11 jvmr jar was compiled for. To verify:
Put your code in an object
package starter

object Test extends App {
  import org.ddahl.jvmr.RInScala
  val R = RInScala()
  println("works")
}
and try to run it as a regular Scala application, not from the worksheet.
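If that runs, the worksheet runtime is the problem. Either way, it is worth pinning the build to a Scala version that matches the jvmr binary. A minimal build.sbt sketch, assuming you use sbt; the org.ddahl coordinates are inferred from the jar name jvmr_2.11-2.11.1.1.jar and should be verified against wherever you obtained the jar:

scalaVersion := "2.11.1"

// assumed coordinates, read off the jar name; verify before use
libraryDependencies += "org.ddahl" % "jvmr_2.11" % "2.11.1.1"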
I'm trying to run tutorial code from VWorkflows (I just copied the code; only the package changed). The project worked before I added this code. I have the requires directives and dependencies for this library in place. Everything seems to be OK except that ScalableContentPane does not resolve, and because of this my code doesn't run. I added the jfxtras dependency to pom.xml:
<dependency>
    <groupId>org.jfxtras</groupId>
    <artifactId>jfxtras-labs</artifactId>
    <version>8.0-r6</version>
</dependency>
I also added a requires directive to module-info.java:
requires jfxtras.labs;
And here is my import:
import jfxtras.labs.scene.layout.ScalableContentPane;
But when I try to run the app, the build output gives me this error
java: the unnamed module reads package jfxtras.labs.util.event from both jfxtras.labs and vworkflows.fx
as the main error, and another error on module-info.java:
java: module com.example.learningfx reads package jfxtras.labs.util.event from both jfxtras.labs and vworkflows.fx
Those are all the errors.
I'm using IntelliJ and I'm building a JavaFX application.
The VWorkflows repository is on GitHub if you need it.
What do I need to do?
First of all, you're using JFXtras Labs, which is experimental and unsupported code, so if it does not work anymore, then that's it. It's called "labs" for a reason :-)
But apparently you are using a version of Java with JPMS (9+), so you should also use JFXtras 9 or 11; you are currently using JFXtras 8.
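A minimal pom.xml sketch of that change, assuming you stay on jfxtras-labs; the version below is a placeholder, so check Maven Central for the current 9.x or 11.x release:

<dependency>
    <groupId>org.jfxtras</groupId>
    <artifactId>jfxtras-labs</artifactId>
    <!-- placeholder version: use the current 9.x or 11.x release from Maven Central -->
    <version>9.0-r1</version>
</dependency>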
I am new to Pact.
I downloaded the code from Github, "pact-jvm" project.
I created a new project in IntelliJ from "existing source" with Gradle setting.
It imported all the packages fine.
However, when I tried running some of the tests in it, I got a "cannot find symbol" error like the following:
Error:(5, 30) java: cannot find symbol
symbol: class PactFragment
location: package au.com.dius.pact.model
I looked at the package au.com.dius.pact.model and found that PactFragment is missing from it.
In addition to that, the classes in the following imports are also missing:
import au.com.dius.pact.matchers.MatchingConfig
import au.com.dius.pact.model.BodyMismatch
import au.com.dius.pact.model.BodyTypeMismatch
import au.com.dius.pact.model.DiffConfig
import au.com.dius.pact.model.HeaderMismatch
import au.com.dius.pact.model.ResponseMatching$
import au.com.dius.pact.model.ResponsePartMismatch
import au.com.dius.pact.model.StatusMismatch
However, I did a "git pull" on all the source code from GitHub for the pact-jvm project, and it shows that everything is "Already up-to-date."
Any pointers what I might be missing?
Why am I missing so many classes in the package au.com.dius.pact.model?
Thanks,
Eric
After re-importing the project from scratch as a Gradle project and installing the Scala plugin, the problem is solved. (Much of pact-jvm is written in Scala, so without the Scala plugin IntelliJ cannot compile or resolve classes like PactFragment.)
I'm working on a project to write invoices to an excel workbook in PeopleSoft 9.2 using PeopleTools 8.54. In our old version (8.49) we did this:
&oWorkApp_Inv = CreateObject("COM", "Excel.Application");
&oWorkApp_Inv.DisplayAlerts = "False";
&oWorkBook_Inv = ObjectGetProperty(&oWorkApp_Inv, "Workbooks");
Doing the same in 8.54, I get an error that the application class COM is not found. I've researched through PeopleBooks and it suggested doing exactly what I'm doing, even with COM as the class. What can I do to fix this, and in what package can I find COM?
The two requirements for using the COM object are:
The server is running Windows
Excel is installed on the server
I created an App Engine in 8.54.13 and it ran successfully on my PSNT process scheduler.
Local object &excel;
&excel = CreateObject("COM", "Excel.Application");
&excel.quit();
I have a local Spark 1.5.2 (Hadoop 2.4) installation on Windows, as explained here.
I'm trying to import a jar file that I created in Java using Maven (the jar is jmatrw, which I uploaded to GitHub). Note that the jar does not contain a Spark program and has no dependencies on Spark. I tried the following steps, but none of them works in my installation:
I copied the library to "E:/installprogram/spark-1.5.2-bin-hadoop2.4/lib/jmatrw-v0.1-beta.jar".
I edited spark-env.sh and added SPARK_CLASSPATH="E:/installprogram/spark-1.5.2-bin-hadoop2.4/lib/jmatrw-v0.1-beta.jar".
In a command window I ran > spark-shell --jars "E:/installprogram/spark-1.5.2-bin-hadoop2.4/lib/jmatrw-v0.1-beta.jar", but it says "Warning: skip remote jar".
In the Spark shell I tried scala> sc.addJar("E:/installprogram/spark-1.5.2-bin-hadoop2.4/lib/jmatrw-v0.1-beta.jar"), and it says "INFO: added jar ... with timestamp".
When I type scala> import it.prz.jmatrw.JMATData, spark-shell replies with error: not found: value it.
I've spent a lot of time searching on Stack Overflow and Google; indeed, a similar Stack Overflow question is here, but I'm still not able to import my custom jar.
Thanks
There are two settings in 1.5.2 to reference an external jar. You can add it for the driver or for the executor(s).
I do this by adding the settings to spark-defaults.conf, but you can also set them when launching spark-shell or in SparkConf.
spark.driver.extraClassPath /path/to/jar/*
spark.executor.extraClassPath /path/to/jar/*
I don't see anything really wrong with the way you are doing it, but you could try the conf approach above, or set them using SparkConf:
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
// note: prefer spark-defaults.conf or --driver-class-path for the driver setting
conf.set("spark.driver.extraClassPath", "/path/to/jar/*")
val sc = new SparkContext(conf)
In general, I haven't enjoyed working with Spark on Windows. Try to get onto Unix/Linux.
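For completeness, a quick sanity check on Windows (reusing the jar path and the import from the question) is to put the jar on the driver classpath at launch and then try the import; a sketch:

spark-shell --driver-class-path "E:/installprogram/spark-1.5.2-bin-hadoop2.4/lib/jmatrw-v0.1-beta.jar"

scala> import it.prz.jmatrw.JMATData

If the import resolves there, the extraClassPath settings above are the persistent equivalent.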
I am trying to invoke the method typeOfInstance() in the following minimal code:
import scala.reflect.mirror._

class Bar

object Main extends App {
  val bar = new Bar()
  typeOfInstance(bar)
}
but I am receiving an AssertionError while executing it:
java.lang.AssertionError: assertion failed: no symbol could be loaded from package annotation (scala equivalent is class com.hablapps.annotation.Bar) by name Bar
The above code runs fine in the REPL (with :power mode). The problem arises when running from sbt (with Scala 2.10-M3 set). Does anybody know what could be happening?
This is a known issue with M3.
In that preview version of Scala, reflection only works with straightforward classloading schemes (e.g. when you run your application using good old java -cp <classpath> <name of the main class>). SBT is a bit more involved, and things blow up.
We've fixed this in 2.10.0-M4.
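For reference, the mirror API changed again before the final release; in released 2.10 the rough equivalent of typeOfInstance goes through a runtime mirror. A sketch against the final 2.10 API (not the M3 one shown above):

import scala.reflect.runtime.{universe => ru}

class Bar

object Main extends App {
  val bar = new Bar()
  // obtain a runtime mirror from the current classloader and
  // recover the instance's type from its runtime class
  val mirror = ru.runtimeMirror(getClass.getClassLoader)
  val tpe = mirror.classSymbol(bar.getClass).toType
  println(tpe) // prints Bar
}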