package EJB module as EAR or JAR? - jar

Should this hello world EJB be packaged as an EAR or a JAR for deployment?
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ cat src/java/net/bounceme/dur/ejb/NewSessionBean.java
package net.bounceme.dur.ejb;

import java.util.logging.Logger;
import javax.ejb.Stateless;

@Stateless(name = "FooBean", mappedName = "ejb/FooBean")
public class NewSessionBean implements NewSessionBeanRemote {

    private final static Logger log = Logger.getLogger(NewSessionBean.class.getName());

    @Override
    public void sayHello() {
        log.info("hi");
    }
}
thufir@dur:~/NetBeansProjects/EJBModule1$
deploy:
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ asadmin list-applications
Nothing to list.
No applications are deployed to this target server.
Command list-applications executed successfully.
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ ant -p
Buildfile: /home/thufir/NetBeansProjects/EJBModule1/build.xml
Builds, tests, and runs the project EJBModule1.
Main targets:
-profile-pre72 Profile a J2EE project in the IDE.
clean Clean build products.
compile Compile project.
debug Debug project in IDE.
default Build whole project.
dist Build distribution (JAR).
dist-directory-deploy Build distribution (JAR) - if directory deployment is not supported.
dist-ear Build distribution (JAR) to be packaged into an EAR.
javadoc Build Javadoc.
profile Profile a J2EE project in the IDE.
run Deploy to server.
test Run unit tests.
test-single Run single unit test.
test-single-method Run single unit test.
Default target: default
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ ant run
Buildfile: /home/thufir/NetBeansProjects/EJBModule1/build.xml
-pre-init:
-init-private:
-init-userdir:
-init-user:
-init-project:
-init-macrodef-property:
-do-init:
-post-init:
-init-check:
-init-ap-cmdline-properties:
-init-macrodef-javac-with-processors:
-init-macrodef-javac-without-processors:
-init-macrodef-javac:
-init-macrodef-test-impl:
-init-macrodef-junit-init:
-init-macrodef-junit-single:
-init-test-properties:
-init-macrodef-junit-batch:
-init-macrodef-junit:
-init-macrodef-junit-impl:
-init-macrodef-testng:
-init-macrodef-testng-impl:
-init-macrodef-test:
-init-macrodef-junit-debug:
-init-macrodef-junit-debug-batch:
-init-macrodef-junit-debug-impl:
-init-macrodef-test-debug-junit:
-init-macrodef-testng-debug:
-init-macrodef-testng-debug-impl:
-init-macrodef-test-debug-testng:
-init-macrodef-test-debug:
-init-macrodef-java:
-init-debug-args:
-init-macrodef-nbjpda:
-init-macrodef-debug:
-init-taskdefs:
-init-ap-cmdline-supported:
-init-ap-cmdline:
init:
-init-cos:
-init-deploy:
-deps-module-jar:
-deps-ear-jar:
deps-jar:
-pre-pre-compile:
-pre-compile:
-copy-meta-inf:
-do-compile:
-post-compile:
compile:
-library-inclusion-in-archive-weblogic:
-library-inclusion-in-archive-by-user:
library-inclusion-in-archive:
-pre-dist:
-do-tmp-dist-without-manifest:
-do-tmp-dist-with-manifest:
-do-dist-directory-deploy:
-post-dist:
dist-directory-deploy:
pre-run-deploy:
-pre-nbmodule-run-deploy:
-run-deploy-nb:
-init-deploy-ant:
-init-cl-deployment-env:
-parse-glassfish-web:
-parse-sun-web:
-no-parse-sun-web:
-add-resources:
-deploy-ant:
-deploy-without-pw:
[echo] Deploying dist/EJBModule1.jar
[get] Getting: http://localhost:4848/__asadmin/deploy?path=/home/thufir/NetBeansProjects/EJBModule1/dist/EJBModule1.jar&force=true&name=EJBModule1
[get] To: /tmp/gfv3945180939
[delete] Deleting: /tmp/gfv3945180939
-deploy-with-pw:
-run-deploy-am:
-post-nbmodule-run-deploy:
post-run-deploy:
-do-update-breakpoints:
run-deploy:
run:
BUILD SUCCESSFUL
Total time: 1 second
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ asadmin list-applications
EJBModule1 <ejb>
Command list-applications executed successfully.
thufir@dur:~/NetBeansProjects/EJBModule1$
thufir@dur:~/NetBeansProjects/EJBModule1$ cat src/java/net/bounceme/dur/ejb/NewSessionBean.java
package net.bounceme.dur.ejb;

import java.util.logging.Logger;
import javax.ejb.Stateless;

@Stateless(name = "FooBean", mappedName = "ejb/FooBean")
public class NewSessionBean implements NewSessionBeanRemote {

    private final static Logger log = Logger.getLogger(NewSessionBean.class.getName());

    @Override
    public void sayHello() {
        log.info("hi");
    }
}
thufir@dur:~/NetBeansProjects/EJBModule1$
It's just a "module"; to my understanding, this means there's no web component (WAR file). But could it still be an EAR, or not?
(The build file etc. are all NetBeans defaults.)

An EAR is used to bundle JAR modules and WAR modules together.
Hence an EAR makes sense when your enterprise application has:
multiple JAR modules
multiple WAR modules
a mix of JAR and WAR modules
There are also classpath implications in such a setup.
If my application has just a single JAR module or WAR module, I would deploy it directly.
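Since the bean above is deployed standalone with mappedName = "ejb/FooBean", a remote client can look it up over JNDI. A minimal sketch, assuming GlassFish defaults with gf-client.jar and the remote interface on the client classpath:
import javax.naming.InitialContext;
import javax.naming.NamingException;
import net.bounceme.dur.ejb.NewSessionBeanRemote;

public class FooClient {

    public static void main(String[] args) throws NamingException {
        // On GlassFish, mappedName doubles as the bean's global JNDI name
        InitialContext ctx = new InitialContext();
        NewSessionBeanRemote bean = (NewSessionBeanRemote) ctx.lookup("ejb/FooBean");
        bean.sayHello();
    }
}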

Related

.net core 3.1 remote debugging from VS 2019 CE to Raspberry PI - vsdbg exits when the app wants to handle a GPIO-interrupt

My app handles GPIO interrupts on a Raspberry Pi 3B+ using the Unosquare.Raspberry.IO and Unosquare.WiringPi NuGet packages. Since it has to handle multiple interrupts, it sets up a task for each of them and then waits on the resulting task list.
The setup for the interrupt is as follows:
public async Task StartCounterAsync(CancellationToken cancelToken)
{
    logger.LogInformation($"Starting Counter {counterName}");
    var counterPin = Pi.Gpio[inputPin];
    counterPin.PinMode = GpioPinDriveMode.Input;
    counterPin.RegisterInterruptCallback(EdgeDetection.RisingEdge, HandleCounterInterrupt);
    await cancelToken;
}

private void HandleCounterInterrupt()
{
    logger.LogInformation("HandleCounterInterrupt");
}
When I launch the application via ssh on the raspi, the application runs fine and the interrupt handling works without any problems.
If I launch the application via "DebugAdapterHost.Launch /LaunchJson:launch.json", it runs fine until I press the button to trigger the interrupt. Then vsdbg simply exits:
1> [DebugAdapter] <-- E (thread): {"seq":97,"type":"event","event":"thread","body":{"reason":"started","threadId":1340}}
*** Here I press the button ***
1> [DebugAdapter] <-- E (exited): {"seq":98,"type":"event","event":"exited","body":{"exitCode":0}}
1> [DebugAdapter] <-- E (terminated): {"seq":99,"type":"event","event":"terminated","body":{}}
1> [DebugAdapter] --> C (disconnect-15): {"type":"request","command":"disconnect","arguments":{},"seq":15}
I have tried this with vsdbg running as a normal user and running as root with the same result. This is no crash, vsdbg just exits. A breakpoint inside HandleCounterInterrupt() is never reached.
Environment:
Raspberry 3 with raspbian
.net core 3.1.4 and vsdbg installed according to https://github.com/Microsoft/MIEngine/wiki/Offroad-Debugging-of-.NET-Core-on-Linux---OSX-from-Visual-Studio and Scott Hanselman's suggestions: https://www.hanselman.com/blog/RemoteDebuggingWithVSCodeOnWindowsToARaspberryPiUsingNETCoreOnARM.aspx
VS 2019 16.6.2 Community Edition
Launch.json (with root setup):
{
    "version": "0.2.0",
    "adapter": "plink.exe",
    "adapterArgs": "-i C:/Users/ur/source/repos/devpikey.ppk root@devpi -batch -T /home/pi/vsdbg/vsdbg --interpreter=vscode",
    "configurations": [
        {
            "name": ".NET Core Launch",
            "type": "coreclr",
            "cwd": "/home/pi/PiS0Counter",
            "program": "/home/pi/dotnet-arm32/dotnet",
            "args": "PiS0Counter.dll",
            "request": "launch"
        }
    ]
}

Why is the return type not found in a custom UDAF in Presto?

The custom function I made does not work in Presto on EMR.
I want to create a simple UDAF that just returns 42.
I wrote a simple function, but it does not work in Presto.
The error in presto-cli is the following:
presto> select answer_to_life('the universe');
Query 20180324_120433_00000_7n6s6 failed: answer_to_life(varchar):bigint not found
com.facebook.presto.spi.PrestoException: answer_to_life(varchar):bigint not found
at com.facebook.presto.metadata.FunctionRegistry.doGetSpecializedFunctionKey(FunctionRegistry.java:972)
at com.google.common.cache.CacheLoader$FunctionToCacheLoader.load(CacheLoader.java:146)
at com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3716)
at com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2424)
at com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2298)
at com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2211)
at com.google.common.cache.LocalCache.get(LocalCache.java:4154)
at com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:4158)
at com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:5147)
at com.google.common.cache.LocalCache$LocalLoadingCache.getUnchecked(LocalCache.java:5153)
at com.facebook.presto.metadata.FunctionRegistry.getSpecializedFunctionKey(FunctionRegistry.java:898)
at com.facebook.presto.metadata.FunctionRegistry.getAggregateFunctionImplementation(FunctionRegistry.java:875)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.buildAccumulatorFactory(LocalExecutionPlanner.java:1973)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.planGlobalAggregation(LocalExecutionPlanner.java:1984)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:955)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.AggregationNode.accept(AggregationNode.java:167)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitExchange(LocalExecutionPlanner.java:1919)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitExchange(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.ExchangeNode.accept(ExchangeNode.java:196)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:952)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitAggregation(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.AggregationNode.accept(AggregationNode.java:167)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitOutput(LocalExecutionPlanner.java:638)
at com.facebook.presto.sql.planner.LocalExecutionPlanner$Visitor.visitOutput(LocalExecutionPlanner.java:596)
at com.facebook.presto.sql.planner.plan.OutputNode.accept(OutputNode.java:82)
at com.facebook.presto.sql.planner.LocalExecutionPlanner.plan(LocalExecutionPlanner.java:393)
at com.facebook.presto.sql.planner.LocalExecutionPlanner.plan(LocalExecutionPlanner.java:324)
at com.facebook.presto.execution.SqlTaskExecution.<init>(SqlTaskExecution.java:161)
at com.facebook.presto.execution.SqlTaskExecution.createSqlTaskExecution(SqlTaskExecution.java:121)
at com.facebook.presto.execution.SqlTaskExecutionFactory.create(SqlTaskExecutionFactory.java:71)
at com.facebook.presto.execution.SqlTask.updateTask(SqlTask.java:340)
at com.facebook.presto.execution.SqlTaskManager.updateTask(SqlTaskManager.java:321)
at com.facebook.presto.server.TaskResource.createOrUpdateTask(TaskResource.java:128)
at sun.reflect.GeneratedMethodAccessor311.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.glassfish.jersey.server.model.internal.ResourceMethodInvocationHandlerFactory.lambda$static$0(ResourceMethodInvocationHandlerFactory.java:76)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher$1.run(AbstractJavaResourceMethodDispatcher.java:148)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.invoke(AbstractJavaResourceMethodDispatcher.java:191)
at org.glassfish.jersey.server.model.internal.JavaResourceMethodDispatcherProvider$ResponseOutInvoker.doDispatch(JavaResourceMethodDispatcherProvider.java:200)
at org.glassfish.jersey.server.model.internal.AbstractJavaResourceMethodDispatcher.dispatch(AbstractJavaResourceMethodDispatcher.java:103)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.invoke(ResourceMethodInvoker.java:493)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:415)
at org.glassfish.jersey.server.model.ResourceMethodInvoker.apply(ResourceMethodInvoker.java:104)
at org.glassfish.jersey.server.ServerRuntime$1.run(ServerRuntime.java:277)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:272)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:268)
at org.glassfish.jersey.internal.Errors.process(Errors.java:316)
at org.glassfish.jersey.internal.Errors.process(Errors.java:298)
at org.glassfish.jersey.internal.Errors.process(Errors.java:268)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:289)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:256)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:703)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:416)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:370)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:389)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:342)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:229)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:841)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1650)
at io.airlift.http.server.TraceTokenFilter.doFilter(TraceTokenFilter.java:63)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1637)
at io.airlift.http.server.TimingFilter.doFilter(TimingFilter.java:52)
at org.eclipse.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1637)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:533)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:143)
at org.eclipse.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:454)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:190)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1253)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:168)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:473)
at org.eclipse.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:166)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1155)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerCollection.handle(HandlerCollection.java:126)
at org.eclipse.jetty.server.handler.StatisticsHandler.handle(StatisticsHandler.java:169)
at org.eclipse.jetty.server.handler.HandlerList.handle(HandlerList.java:61)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:132)
at org.eclipse.jetty.server.Server.handle(Server.java:564)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:317)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:251)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:279)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:110)
at org.eclipse.jetty.io.ChannelEndPoint$2.run(ChannelEndPoint.java:124)
at org.eclipse.jetty.util.thread.Invocable.invokePreferred(Invocable.java:128)
at org.eclipse.jetty.util.thread.Invocable$InvocableExecutor.invoke(Invocable.java:222)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:294)
at org.eclipse.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:199)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:673)
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:591)
at java.lang.Thread.run(Thread.java:748)
The code of the aggregation function is the following.
I used this Presto code as a reference.
package sample;

import com.facebook.presto.spi.block.BlockBuilder;
import com.facebook.presto.spi.function.*;
import com.facebook.presto.spi.type.BigintType;
import com.facebook.presto.spi.type.StandardTypes;
import io.airlift.slice.Slice;

@AggregationFunction("answer_to_life")
public final class AnswerToLife {

    private AnswerToLife() {
    }

    @InputFunction
    public static void input(@AggregationState NullState state, @SqlType(StandardTypes.VARCHAR) Slice value) {
    }

    @CombineFunction
    public static void combine(@AggregationState NullState state, @AggregationState NullState other) {
    }

    @OutputFunction(StandardTypes.BIGINT)
    public static void output(@AggregationState NullState state, BlockBuilder out) {
        BigintType.BIGINT.writeLong(out, 42);
    }
}
Detail code is here (https://github.com/asari-mtr/presto-udaf/tree/stackoverflow).
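The code above references a NullState class that is not shown; judging from the jar listing below (sample/NullState.class), it would be a minimal AccumulatorState. A sketch, assuming nothing actually needs to be accumulated:
package sample;

import com.facebook.presto.spi.function.AccumulatorState;

// Marker state: the aggregation accumulates nothing, so no getters or
// setters are declared; Presto generates the implementation class.
public interface NullState extends AccumulatorState {
}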
The structure I deployed is the following:
$ ls -1 /usr/lib/presto/plugin/my-udaf/
commons-codec-1.4.jar
guava-21.0.jar
hive-udf-1.0-SNAPSHOT.jar
presto-array-0.197.jar
stats-0.155.jar
presto-udaf-1.0-SNAPSHOT.jar
I am using emr-5.12.0 (Presto 0.188).
Thank you for your time.
Edit 1
The list of files in the jar:
% jar -tf target/presto-udaf-1.0-SNAPSHOT.jar
Picked up _JAVA_OPTIONS: -Dfile.encoding=UTF-8
META-INF/
META-INF/MANIFEST.MF
META-INF/services/
sample/
META-INF/services/com.facebook.presto.spi.Plugin
sample/AnswerToLife.class
sample/AnswerToLifePlugin.class
sample/NullState.class
META-INF/maven/
META-INF/maven/sample/
META-INF/maven/sample/presto-udaf/
META-INF/maven/sample/presto-udaf/pom.xml
META-INF/maven/sample/presto-udaf/pom.properties
server.log:
$ grep answer -i /mnt/var/log/presto/server.log
2018-03-26T06:37:14.213Z INFO main com.facebook.presto.server.PluginManager Installing sample.AnswerToLifePlugin
2018-03-26T06:37:14.214Z INFO main com.facebook.presto.server.PluginManager Registering functions from sample.AnswerToLife
And executing show functions:
presto> show functions;
Function | Return Type | Argument Types
---------------------------------+-----------------------------------------------------+------------------------------------------------------------------------------
ST_Area | double | Geometry
ST_AsText | varchar | Geometry
...
answer_to_life | bigint | varchar
...
Edit 2
Deleted src/main/resources/META-INF/services/com.facebook.presto.spi.Plugin
Added <packaging>presto-plugin</packaging> to pom.xml
https://github.com/asari-mtr/presto-udaf/commit/f2a2ddbf0339e08f418b378a7ead511020e98a3b
I deployed the zip made by Maven under /usr/lib/presto/plugin/.
But the error does not change.
Edit 3
I got the Presto source from GitHub (branch 0.188) and built it on my Mac.
When I placed the above UDAF on that Presto, it worked perfectly.
Perhaps there is a mistake in my installation procedure for Presto on EMR.
Solved!!
The custom plugin did not fail because of how it was built; the way I was deploying it on EMR was wrong.
Since my procedure did not deploy the plugin to all nodes, I fixed it by registering the following shell script as a bootstrap action:
#!/bin/bash
aws s3 cp s3://hogehoge/presto-udaf-1.0-SNAPSHOT.zip /tmp/
sudo mkdir -p /usr/lib/presto/plugin/
sudo unzip -d /usr/lib/presto/plugin/ /tmp/presto-udaf-1.0-SNAPSHOT.zip
10.1. SPI Overview — Presto 0.198 Documentation
Plugins must be installed on all nodes in the Presto cluster
(coordinator and workers).
Create Bootstrap Actions to Install Additional Software - Amazon EMR
You can use a bootstrap action to copy objects from Amazon S3 to each
node in a cluster before your applications are installed. The AWS CLI
is installed on each node of a cluster, so your bootstrap action can
call AWS CLI commands.
From what you wrote, there are two things missing:
make sure that you used the Maven packaging presto-plugin in your pom.xml
also implement com.facebook.presto.spi.Plugin, whose getFunctions method returns your function.
Take a look at the presto-geospatial Presto module and see com.facebook.presto.plugin.geospatial.GeoPlugin and its pom.xml as a reference.
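A minimal sketch of such a Plugin implementation (the jar listing earlier shows sample/AnswerToLifePlugin.class, so something close to this presumably already exists; Guava is available per the plugin directory listing):
package sample;

import com.facebook.presto.spi.Plugin;
import com.google.common.collect.ImmutableSet;
import java.util.Set;

public class AnswerToLifePlugin implements Plugin {

    @Override
    public Set<Class<?>> getFunctions() {
        // Presto scans this class for @AggregationFunction annotations
        return ImmutableSet.of(AnswerToLife.class);
    }
}
With presto-plugin packaging, Maven generates the META-INF/services registration itself, which is presumably why the hand-written services file was deleted in Edit 2.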

Adding sbt native packager plugin in SBT

I have a very organized build that is composed of the following Scala files:
Build.scala - the main Build file
Dependencies.scala - where I define the dependencies and the versions
BuildSettings.scala - where I define the build settings
plugins.sbt
A snippet of the Build.scala is as below:
import sbt._
import Keys._

object MyBuild extends Build {

  import Dependencies._
  import BuildSettings._
  import NativePackagerHelper._

  // Configure prompt to show current project
  override lazy val settings = super.settings :+ {
    shellPrompt := { s => Project.extract(s).currentProject.id + " > " }
  }

  // Define our project, with basic project information and library dependencies
  lazy val project = Project("my-project", file("."))
    .settings(buildSettings: _*)
    .settings(
      libraryDependencies ++= Seq(
        Libraries.scalaAsync
        // Add your additional libraries here (comma-separated)...
      )
    ).enablePlugins(JavaAppPackaging, DockerPlugin)
}
All four files mentioned above are in the same directory, which is inside the project directory. But when I run this build, I get the following error:
not found: value NativePackagerHelper
Any clues why this is?
I figured out what the problem was. I had to use the following in my build.properties:
sbt.version=0.13.11
I originally had 0.13.6, and it was causing the import statements to fail!

Storm-R integration

I am trying to integrate my R script with Storm. The code for my RBolt is:
import java.util.Map;

import backtype.storm.Config;
import backtype.storm.task.ShellBolt;
import backtype.storm.topology.IRichBolt;
import backtype.storm.topology.OutputFieldsDeclarer;
import backtype.storm.tuple.Fields;

public class RBolt extends ShellBolt implements IRichBolt {

    public RBolt() {
        // Launch the R script as a multilang subprocess: "Rscript storm_OR.R"
        super("Rscript", "storm_OR.R");
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer outputFieldsDeclarer) {
        outputFieldsDeclarer.declare(new Fields("OR"));
    }

    @Override
    public Map<String, Object> getComponentConfiguration() {
        Config ret = new Config();
        ret.setMaxTaskParallelism(1);
        return ret;
    }
}
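For reference, a bolt like this is wired into the topology in the usual way; a minimal sketch, where EventSpout is a hypothetical stand-in for the actual spout (which is not shown in the question):
import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.topology.TopologyBuilder;

public class EventProcessingTopology {

    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();
        // EventSpout is hypothetical; any spout emitting tuples for the R script works
        builder.setSpout("event-spout", new EventSpout());
        builder.setBolt("r-bolt", new RBolt(), 1).shuffleGrouping("event-spout");

        // Run in-process for testing; a real deployment would use StormSubmitter
        LocalCluster cluster = new LocalCluster();
        cluster.submitTopology("EventProcessing", new Config(), builder.createTopology());
    }
}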
I am getting the following error. Any help? I have made sure that the PATH variable includes the paths of R and Rscript.
17469 [Thread-12-__system] INFO backtype.storm.daemon.executor - Preparing bolt __system:(-1)
17474 [Thread-12-__system] INFO backtype.storm.daemon.executor - Prepared bolt __system:(-1)
17480 [Thread-6] INFO backtype.storm.daemon.executor - Loading executor RBolt:[1 1]
17483 [Thread-6] INFO backtype.storm.daemon.executor - Loaded executor tasks RBolt:[1 1]
17491 [Thread-6] INFO backtype.storm.daemon.executor - Finished loading executor RBolt:[1 1]
17491 [Thread-6] INFO backtype.storm.daemon.worker - Launching receive-thread for 8d8a13de-5e87-4e14-b2c2-59b4dfc070c6:1027
17493 [Thread-14-RBolt] INFO backtype.storm.daemon.executor - Preparing bolt RBolt:(1)
17496 [Thread-15-worker-receiver-thread-0] INFO backtype.storm.messaging.loader - Starting receive-thread: [stormId: EventProcessing-1-1457335172, port: 1027, thread-id: 0 ]
17500 [Thread-14-RBolt] INFO backtype.storm.utils.ShellProcess - Storm multilang serializer: backtype.storm.multilang.JsonSerializer
17510 [Thread-14-RBolt] ERROR backtype.storm.util - Async loop died!
java.lang.RuntimeException: Error when launching multilang subprocess
at backtype.storm.utils.ShellProcess.launch(ShellProcess.java:64) ~[storm-core-0.9.2-incubating.jar:0.9.2-incubating]
at backtype.storm.task.ShellBolt.prepare(ShellBolt.java:99) ~[storm-core-0.9.2-incubating.jar:0.9.2-incubating]
at backtype.storm.daemon.executor$fn__5641$fn__5653.invoke(executor.clj:690) ~[storm-core-0.9.2-incubating.jar:0.9.2-incubating]
at backtype.storm.util$async_loop$fn__457.invoke(util.clj:429) ~[storm-core-0.9.2-incubating.jar:0.9.2-incubating]
at clojure.lang.AFn.run(AFn.java:24) [clojure-1.5.1.jar:na]
at java.lang.Thread.run(Thread.java:745) [na:1.7.0_67]
Caused by: java.io.IOException: Cannot run program "Rscript" (in directory "/tmp/933c85f3-f5b5-4a60-b342-7d4969b43d46/supervisor/stormdist/EventProcessing-1-1457335172/resources"): error=2, No such file or directory
This directory in the tmp folder does not exist; it is created on the fly. Any suggestions, please?
UPDATE: Resolved this by creating another resources folder inside the project's resources folder, so that the jar contains a resources folder with the R script in it.
The whole purpose of "shell" components is to start as an independent process; therefore your script needs to implement the multilang protocol.
Alternatively, you can find a library that implements the protocol and has R integration, like FsStorm: it implements multilang, and you can call R functions via the R type provider.

How to use Spark 1.2.0 in Play 2.2.3 project as it fails with NoSuchMethodError: akka.util.Helpers?

Have you ever had a problem with the Play framework? In my case, first of all, I built an all-in-one jar, spark-assembly-1.2.0-hadoop2.4.0.jar, and Spark works perfectly from a shell. But there are two questions:
Should I use this assembled Spark jar in the Play project, and how? I tried to move it into the lib directory, and it didn't help to provide any Spark imports.
Or should I define the Spark library like: "org.apache.spark" %% "spark-core" % "1.2.0"?
PLAY FRAMEWORK CODE:
Build.scala:
val appDependencies = Seq(
  jdbc
  , "org.apache.spark" %% "spark-streaming" % "1.2.0"
  , "org.apache.spark" %% "spark-core" % "1.2.0"
  , "org.apache.spark" %% "spark-sql" % "1.2.0"
TestEntity.scala:
package models

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.hadoop.conf.Configuration
import models.SparkMain
import org.apache.spark.rdd.RDD

object TestEntity {

  val TestEntityPath = "/home/t/PROD/dict/TestEntity .txt"
  val TestEntitySpark = SparkMain.sc.textFile(TestEntityPath, 4).cache
  val TestEntityData = TestEntitySpark.flatMap(_.split(","))

  def getFive(): Seq[String] = {
    println("TestEntity.getFive")
    TestEntityData.take(5)
  }
}
SparkMain.scala:
package models

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.hadoop.conf.Configuration
import org.apache.spark.rdd.RDD
import org.apache.spark.streaming.{ Seconds, StreamingContext }
import StreamingContext._
import org.apache.spark.sql.SQLContext
import org.apache.spark.SparkConf

object SparkMain {

  val driverPort = 8080
  val driverHost = "localhost"

  val conf = new SparkConf(false) // skip loading external settings
    .setMaster("local[4]") // run locally with enough threads
    .setAppName("firstSparkApp")
    .set("spark.logConf", "true")
    .set("spark.driver.port", s"$driverPort")
    .set("spark.driver.host", s"$driverHost")
    .set("spark.akka.logLifecycleEvents", "true")

  val sc = new SparkContext(conf)
}
and the controller code, which uses the Spark stuff:
def test = Action {
  implicit req => {
    val chk = TestEntity.getFive
    Ok("it works")
  }
}
At runtime I get these errors:
[info] o.a.s.SparkContext - Spark configuration:
spark.akka.logLifecycleEvents=true
spark.app.name=firstSparkApp
spark.driver.host=localhost
spark.driver.port=8080
spark.logConf=true
spark.master=local[4]
[warn] o.a.s.u.Utils - Your hostname, uisprk resolves to a loopback address: 127.0.1.1; using 10.0.2.15 instead (on interface eth0)
[warn] o.a.s.u.Utils - Set SPARK_LOCAL_IP if you need to bind to another address
[info] o.a.s.SecurityManager - Changing view acls to: t
[info] o.a.s.SecurityManager - Changing modify acls to: t
[info] o.a.s.SecurityManager - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(t); users with modify permissions: Set(t)
[error] application -
! @6l039e8d5 - Internal server error, for (GET) [/ui] ->
play.api.Application$$anon$1: Execution exception[[RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;]]
at play.api.Application$class.handleError(Application.scala:293) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.DefaultApplication.handleError(Application.scala:399) [play_2.10-2.2.3.jar:2.2.3]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:166) [play_2.10-2.2.3.jar:2.2.3]
at play.core.server.netty.PlayDefaultUpstreamHandler$$anonfun$13$$anonfun$apply$1.applyOrElse(PlayDefaultUpstreamHandler.scala:163) [play_2.10-2.2.3.jar:2.2.3]
at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:33) [scala-library-2.10.4.jar:na]
at scala.util.Failure$$anonfun$recover$1.apply(Try.scala:185) [scala-library-2.10.4.jar:na]
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
at play.api.mvc.ActionBuilder$$anon$1.apply(Action.scala:314) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4$$anonfun$apply$5.apply(Action.scala:109) ~[play_2.10-2.2.3.jar:2.2.3]
at play.utils.Threads$.withContextClassLoader(Threads.scala:18) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:108) ~[play_2.10-2.2.3.jar:2.2.3]
at play.api.mvc.Action$$anonfun$apply$1$$anonfun$apply$4.apply(Action.scala:107) ~[play_2.10-2.2.3.jar:2.2.3]
Caused by: java.lang.NoSuchMethodError: akka.util.Helpers$.ConfigOps(Lcom/typesafe/config/Config;)Lcom/typesafe/config/Config;
at akka.remote.RemoteSettings.<init>(RemoteSettings.scala:48) ~[akka-remote_2.10-2.3.4-spark.jar:na]
at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:114) ~[akka-remote_2.10-2.3.4-spark.jar:na]
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.7.0_72]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57) ~[na:1.7.0_72]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.7.0_72]
at java.lang.reflect.Constructor.newInstance(Constructor.java:526) ~[na:1.7.0_72]
How should I tie in the library: through the dependency or the assembled jar?
Any advice, please.
A NoSuchMethodError exception is 100% due to a mismatch of jar versions between compile time and runtime.
Check the versions of the jars. I also have some questions about the architecture of your app:
Instead of calling Spark code from the Play framework, you could call spark-submit from shell scripts, which looks better in your case. You can even do that from your Play application; there is no need to include the jar in the Play app classpath.
The problem with the configuration is the Akka dependency of Apache Spark and Play Framework: they both depend on Akka and, as you've seen, the different and incompatible versions should be resolved at build time with the evicted command in sbt.
You may also want to use the update command; the reports in target/resolution-cache/reports are quite useful.
