Accumulo init - [start.Main] ERROR: initializing the class loader - cloudera

I'm new to Accumulo and trying to install v1.7 on a Cloudera VM.
I have Java 1.7 and HDP 2.2, and ZooKeeper is currently running. I've mostly been following INSTALL.md without incident and have configured Accumulo, but I get the following error when trying to initialise:
./bin/accumulo init
2016-02-23 09:24:07,999 [start.Main] ERROR: Problem initializing the class loader
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.accumulo.start.Main.getClassLoader(Main.java:68)
at org.apache.accumulo.start.Main.main(Main.java:52)
Caused by: java.lang.NoClassDefFoundError: org/apache/commons/logging/LogFactory
at org.apache.commons.vfs2.impl.DefaultFileSystemManager.<init>(DefaultFileSystemManager.java:120)
at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.generateVfs(AccumuloVFSClassLoader.java:246)
at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.getClassLoader(AccumuloVFSClassLoader.java:204)
... 6 more
Caused by: java.lang.ClassNotFoundException: org.apache.commons.logging.LogFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 9 more
Exception in thread "Thread-0" java.lang.NoClassDefFoundError: org/apache/common s/io/FileUtils
at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader.clos e(AccumuloVFSClassLoader.java:406)
at org.apache.accumulo.start.classloader.vfs.AccumuloVFSClassLoader$Accu muloVFSClassLoaderShutdownThread.run(AccumuloVFSClassLoader.java:74)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.io.FileUtils
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at org.apache.accumulo.start.classloader.AccumuloClassLoader$2.loadClass(AccumuloClassLoader.java:281)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 3 more
I've read other posts where this has been put down to a bad setting in accumulo-env.sh, but as shown below I don't see what I'm missing:
if [[ -z $HADOOP_HOME ]] ; then
   test -z "$HADOOP_PREFIX" && export HADOOP_PREFIX=/usr/lib/hadoop
else
   HADOOP_PREFIX="$HADOOP_HOME"
   unset HADOOP_HOME
fi
# hadoop-2.0:
test -z "$HADOOP_CONF_DIR" && export HADOOP_CONF_DIR="$HADOOP_PREFIX/etc/hadoop"
test -z "$ACCUMULO_HOME" && export ACCUMULO_HOME="/etc/accumulo/accumulo-1.7.0"
test -z "$JAVA_HOME" && export JAVA_HOME="/usr/java/jdk1.7.0_67-cloudera"
test -z "$ZOOKEEPER_HOME" && export ZOOKEEPER_HOME=/usr/lib/zookeeper
test -z "$ACCUMULO_LOG_DIR" && export ACCUMULO_LOG_DIR=$ACCUMULO_HOME/logs
if [[ -f ${ACCUMULO_CONF_DIR}/accumulo.policy ]]
then
   POLICY="-Djava.security.manager -Djava.security.policy=${ACCUMULO_CONF_DIR}/accumulo.policy"
fi
Furthermore, I have the following in my general.classpaths property:
<property>
<name>general.classpaths</name>
<value>
<!-- Accumulo requirements -->
$ACCUMULO_HOME/lib/accumulo-server.jar,
$ACCUMULO_HOME/lib/accumulo-core.jar,
$ACCUMULO_HOME/lib/accumulo-start.jar,
$ACCUMULO_HOME/lib/accumulo-fate.jar,
$ACCUMULO_HOME/lib/accumulo-proxy.jar,
$ACCUMULO_HOME/lib/[^.].*.jar,
<!-- ZooKeeper requirements -->
$ZOOKEEPER_HOME/zookeeper[^.].*.jar,
<!-- Common Hadoop requirements -->
$HADOOP_CONF_DIR,
<!-- Hadoop 2 requirements --><!--
$HADOOP_PREFIX/share/hadoop/common/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/common/lib/(?!slf4j)[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/hdfs/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/mapreduce/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/yarn/[^.].*.jar,
$HADOOP_PREFIX/share/hadoop/yarn/lib/jersey.*.jar,
--><!-- End Hadoop 2 requirements -->
<!-- HDP 2.0 requirements --><!--
/usr/lib/hadoop/[^.].*.jar,
/usr/lib/hadoop/lib/[^.].*.jar,
/usr/lib/hadoop-hdfs/[^.].*.jar,
/usr/lib/hadoop-mapreduce/[^.].*.jar,
/usr/lib/hadoop-yarn/[^.].*.jar,
/usr/lib/hadoop-yarn/lib/jersey.*.jar,
--><!-- End HDP 2.0 requirements -->
<!-- HDP 2.2 requirements -->
/usr/hdp/current/hadoop-client/[^.].*.jar,
/usr/hdp/current/hadoop-client/lib/(?!slf4j)[^.].*.jar,
/usr/hdp/current/hadoop-hdfs-client/[^.].*.jar,
/usr/hdp/current/hadoop-mapreduce-client/[^.].*.jar,
/usr/hdp/current/hadoop-yarn-client/[^.].*.jar,
/usr/hdp/current/hadoop-yarn-client/lib/jersey.*.jar,
/usr/hdp/current/hive-client/lib/hive-accumulo-handler.jar
/usr/lib/hadoop/lib/commons-io-2.4.jar
<!-- End HDP 2.2 requirements -->
</value>
<description>Classpaths that accumulo checks for updates and class files.</description>
Any help would be appreciated. Interestingly, I get the same result when running ./bin/accumulo classpath.

Accumulo expects to pull the commons-io-2.4.jar from your Hadoop installation. I'm not sure whether CDH is packaging this jar at all, or if your configuration files are just not pointing at it correctly.
You can try examining the output of accumulo classpath to see what the items on the classpath actually expand to. The general.classpaths configuration item in accumulo-site.xml is what you will want to inspect/modify.
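For example, a quick check along these lines (the jar locations below are assumptions based on the paths in the question, not something verified for your VM):
./bin/accumulo classpath | grep -i commons          # do commons-logging / commons-io show up at all?
ls /usr/lib/hadoop/lib/commons-io-*.jar \
   /usr/hdp/current/hadoop-client/lib/commons-io-*.jar 2>/dev/null   # do the jars exist where general.classpaths points?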

unset CLASSPATH
I had the same problem; it took me hours to figure out.
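In other words, clear whatever CLASSPATH the shell has inherited before running Accumulo; a minimal sketch, assuming you then re-run the command from the question:
unset CLASSPATH        # drop the environment's CLASSPATH so Accumulo builds its own
./bin/accumulo init    # re-run the failing command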

Related

Exception in Starting Ignite inside an OSGi container

The exception below occurred while starting Ignite in Apache Karaf via an OSGi Bundle Activator.
Version:
Ignite : 2.3.0
Karaf : 4.0.9
Steps followed:
karaf#root()>feature:install ignite-core
karaf#root()>feature:install ignite-indexing
karaf#root()>bundle:install -s mvn:org.apache.ignite/ignite-osgi/2.3.0
Exception:
[22:10:55] (err) Failed to start Ignite via OSGi Activator [errMsg=org/h2/index/Index]java.lang.NoClassDefFoundError: org/h2/index/Index
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.ignite.internal.IgniteComponentType.inClassPath(IgniteComponentType.java:153)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start0(IgnitionEx.java:1842)
at org.apache.ignite.internal.IgnitionEx$IgniteNamedInstance.start(IgnitionEx.java:1652)
at org.apache.ignite.internal.IgnitionEx.start0(IgnitionEx.java:1080)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:600)
at org.apache.ignite.internal.IgnitionEx.start(IgnitionEx.java:525)
at org.apache.ignite.Ignition.start(Ignition.java:322)
at org.apache.ignite.osgi.IgniteAbstractOsgiContextActivator.start(IgniteAbstractOsgiContextActivator.java:108)
at org.apache.felix.framework.util.SecureAction.startActivator(SecureAction.java:697)
at org.apache.felix.framework.Felix.activateBundle(Felix.java:2238)
at org.apache.felix.framework.Felix.startBundle(Felix.java:2144)
at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1371)
at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:308)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.h2.index.Index not found by org.apache.ignite.ignite-core [72]
at org.apache.felix.framework.BundleWiringImpl.findClassOrResourceByDelegation(BundleWiringImpl.java:1550)
at org.apache.felix.framework.BundleWiringImpl.access$200(BundleWiringImpl.java:79)
at org.apache.felix.framework.BundleWiringImpl$BundleClassLoader.loadClass(BundleWiringImpl.java:1958)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 16 more
Please let me know if any configuration is missing.
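One diagnostic sketch that may help (a suggestion on my part, not something from the thread): since ignite-indexing relies on H2 classes such as org.h2.index.Index, it is worth confirming from the Karaf client that an H2 bundle is actually installed and resolved.
bin/client 'feature:list -i | grep ignite'    # run from the Karaf install dir; are the ignite features installed?
bin/client 'bundle:list | grep -i h2'         # is an H2 bundle present and Resolved/Active?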

Oozie not starting in Amazon EMR

I tried to install Apache Oozie on an EMR cluster and I am getting the error "Error: IO_ERROR : java.net.ConnectException: Connection refused".
I followed the link below for installation:
http://pkavuri.blogspot.in/2013/08/oozie-installation-is-simplified.html
I got the error after running the below command:
bin/oozie admin -oozie http://localhost:11000/oozie -status
These are the steps I took after encountering the error:
Moved the Hadoop and common jar files to the folders
“/oozie-3.3.2/distro/target/oozie-3.3.2-distro/oozie-3.3.2/oozie-server/webapps/oozie/WEB-INF/lib”
and “oozie-3.3.2/distro/target/oozie-3.3.2-distro/oozie-3.3.2/lib/”
Downloaded derby in oozie-3.3.2/libext
The error trace after running the command "tail -100f logs/catalina.out":
ERROR: Oozie could not be started
REASON: java.lang.NoClassDefFoundError: org/apache/hadoop/util/ReflectionUtils
Stacktrace:
-----------------------------------------------------------------
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ReflectionUtils
at org.apache.oozie.service.Services.setServiceInternal(Services.java:359)
at org.apache.oozie.service.Services.<init>(Services.java:108)
at org.apache.oozie.servlet.ServicesLoader.contextInitialized(ServicesLoader.java:38)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.ReflectionUtils
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1680)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1526)
... 27 more
Try creating a libext folder and putting all the Hadoop jars and extjs jars in it. Then run oozie-setup.sh and then oozie-run.sh.
In this case, based on your logs, I guess you are missing hadoop-core.jar.
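A rough sketch of that suggestion (the jar locations are assumptions for an EMR box; point them at wherever your Hadoop jars actually live):
cd oozie-3.3.2/distro/target/oozie-3.3.2-distro/oozie-3.3.2
mkdir -p libext
cp /usr/lib/hadoop/hadoop-core*.jar libext/    # provides org.apache.hadoop.util.ReflectionUtils
cp /usr/lib/hadoop/lib/*.jar libext/           # supporting Hadoop dependencies
# drop the extjs zip into libext/ as well if you want the web console
bin/oozie-setup.sh
bin/oozie-run.sh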

Error in creating an Executable JAR

I have created a JAR file at the command prompt. While executing the JAR file with the command java -jar MyJAR.jar, I am getting the following error.
Exception in thread "main" java.lang.NoClassDefFoundError.
Caused by: java.lang.ClassNotFoundException: ShipmentData
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: ShipmentData. Program will exit.
Please help me to sort this issue.
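For reference, a minimal sketch of building and running such a jar, assuming ShipmentData.class sits in the current directory and in the default package (which may not match your layout):
jar cfe MyJAR.jar ShipmentData ShipmentData.class   # 'e' writes Main-Class: ShipmentData into the manifest
java -jar MyJAR.jar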

running mahout RecommenderJob on EMR

I'm trying to run a RecommenderJob on Amazon EMR. I have a jar called SmartJukebox.jar (not runnable) that contains the class main.TrackRecommander (and that's it).
I created a job flow with the jar:
s3n://smartjukebox/SmartJukebox.jar
and args:
main.TrackRecommander --input s3n://smartjukebox/ratings.csv --output s3n://smartjukebox/output --usersFile s3n://smartjukebox/user.txt.
The class TrackRecommander uses the class RecommenderJob.
I run the job flow and I get this in the error log:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/mahout/cf/taste/hadoop/item/RecommenderJob
at main.TrackRecommander.main(TrackRecommander.java:136)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
Caused by: java.lang.ClassNotFoundException: org.apache.mahout.cf.taste.hadoop.item.RecommenderJob
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 6 more
Now I see that the JVM can't find RecommenderJob, and I didn't put RecommenderJob in my jar. I thought EMR would have the Mahout jars built in, but I can't find anything about that.
What is the solution here?
Thanks.
Your problem is exactly what you say: "I didn't put RecommenderJob in my jar." Unless you put those classes in your JAR, of course it can't be found. Why would EMR have this built in? Add the Mahout ".job" file classes to your JAR first.
You will need to create a job jar which contains all the classes required by the code to run, including the Mahout classes.
Take a look at
https://github.com/tdunning/MiA
Check how to create a job jar using the maven-assembly-plugin in pom.xml and the job.xml in the src/main/resources directory.
If you exclude the Hadoop classes, then you can run it on any Hadoop instance.
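As a rough sketch of that check (the artifact name below is an assumption; it depends on how the assembly is configured in your pom.xml):
mvn clean package                                           # build the job/assembly jar
jar tf target/SmartJukebox-job.jar | grep RecommenderJob    # confirm the Mahout class actually ended up inside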

EJBContainer.createEJBContainer() Fails if another GF instance is running

I have a Maven project running a TestNG test that calls EJBContainer.createEJBContainer(), and it fails with the following error ONLY if I have another GF 3.1 instance running.
javax.ejb.EJBException: No EJBContainer provider available
The following providers:
org.glassfish.ejb.embedded.EJBContainerProviderImpl
Returned null from createEJBContainer call.
at javax.ejb.embeddable.EJBContainer.reportError(EJBContainer.java:216)
at javax.ejb.embeddable.EJBContainer.createEJBContainer(EJBContainer.java:146)
Log output indicates Caused by: javax.naming.NameNotFoundException: jdbc
FINE: Exception while invoking class org.glassfish.persistence.jpa.JPADeployer prepare method
java.lang.RuntimeException: javax.naming.NamingException: Lookup failed for 'jdbc/__default' in SerialContext[myEnv={com.sun.enterprise.connectors.jndisuffix=__pm, java.naming.factory.initial=com.sun.enterprise.naming.impl.SerialInitContextFactory, java.naming.factory.state=com.sun.corba.ee.impl.presentation.rmi.JNDIStateFactoryImpl, java.naming.factory.url.pkgs=com.sun.enterprise.naming} [Root exception is javax.naming.NameNotFoundException: jdbc]
at org.glassfish.persistence.jpa.PersistenceUnitInfoImpl.<init>(PersistenceUnitInfoImpl.java:111)
at org.glassfish.persistence.jpa.PersistenceUnitLoader.loadPU(PersistenceUnitLoader.java:154)
at org.glassfish.persistence.jpa.PersistenceUnitLoader.<init>(PersistenceUnitLoader.java:119)
at org.glassfish.persistence.jpa.JPADeployer$1.visitPUD(JPADeployer.java:213)
at org.glassfish.persistence.jpa.JPADeployer$PersistenceUnitDescriptorIterator.iteratePUDs(JPADeployer.java:486)
at org.glassfish.persistence.jpa.JPADeployer.createEMFs(JPADeployer.java:220)
at org.glassfish.persistence.jpa.JPADeployer.prepare(JPADeployer.java:166)
at com.sun.enterprise.v3.server.ApplicationLifecycle.prepareModule(ApplicationLifecycle.java:870)
at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:410)
at com.sun.enterprise.v3.server.ApplicationLifecycle.deploy(ApplicationLifecycle.java:240)
at org.glassfish.kernel.embedded.EmbeddedDeployerImpl.deploy(EmbeddedDeployerImpl.java:193)
at org.glassfish.ejb.embedded.EJBContainerImpl.deploy(EJBContainerImpl.java:137)
at org.glassfish.ejb.embedded.EJBContainerProviderImpl.createEJBContainer(EJBContainerProviderImpl.java:132)
at javax.ejb.embeddable.EJBContainer.createEJBContainer(EJBContainer.java:127)
at org.primewest.persistence.service.BasicPersistenceServiceBeanTest.setup(BasicPersistenceServiceBeanTest.java:39)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:76)
at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:525)
at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:202)
at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:130)
at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:173)
at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:105)
at org.testng.TestRunner.runWorkers(TestRunner.java:1147)
at org.testng.TestRunner.privateRun(TestRunner.java:749)
at org.testng.TestRunner.run(TestRunner.java:600)
at org.testng.SuiteRunner.runTest(SuiteRunner.java:317)
at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:312)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:274)
at org.testng.SuiteRunner.run(SuiteRunner.java:223)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1039)
at org.testng.TestNG.runSuitesLocally(TestNG.java:964)
at org.testng.TestNG.run(TestNG.java:900)
at org.apache.maven.surefire.testng.TestNGExecutor.run(TestNGExecutor.java:74)
at org.apache.maven.surefire.testng.TestNGXmlTestSuite.execute(TestNGXmlTestSuite.java:92)
at org.apache.maven.surefire.Surefire.run(Surefire.java:177)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.maven.surefire.booter.SurefireBooter.runSuitesInProcess(SurefireBooter.java:345)
at org.apache.maven.surefire.booter.SurefireBooter.main(SurefireBooter.java:1009)
Caused by: javax.naming.NamingException: Lookup failed for 'jdbc/__default' in SerialContext[myEnv={com.sun.enterprise.connectors.jndisuffix=__pm, java.naming.factory.initial=com.sun.enterprise.naming.impl.SerialInitContextFactory, java.naming.factory.state=com.sun.corba.ee.impl.presentation.rmi.JNDIStateFactoryImpl, java.naming.factory.url.pkgs=com.sun.enterprise.naming} [Root exception is javax.naming.NameNotFoundException: jdbc]
at com.sun.enterprise.naming.impl.SerialContext.lookup(SerialContext.java:518)
at com.sun.enterprise.naming.impl.SerialContext.lookup(SerialContext.java:455)
at javax.naming.InitialContext.lookup(InitialContext.java:392)
at javax.naming.InitialContext.lookup(InitialContext.java:392)
at com.sun.appserv.connectors.internal.api.ResourceNamingService.lookup(ResourceNamingService.java:221)
at com.sun.enterprise.connectors.service.ConnectorResourceAdminServiceImpl.lookup(ConnectorResourceAdminServiceImpl.java:225)
at com.sun.enterprise.connectors.ConnectorRuntime.lookupPMResource(ConnectorRuntime.java:462)
at org.glassfish.persistence.common.PersistenceHelper.lookupPMResource(PersistenceHelper.java:63)
at org.glassfish.persistence.jpa.ProviderContainerContractInfoBase.lookupDataSource(ProviderContainerContractInfoBase.java:71)
at org.glassfish.persistence.jpa.PersistenceUnitInfoImpl.<init>(PersistenceUnitInfoImpl.java:108)
... 45 more
Caused by: javax.naming.NameNotFoundException: jdbc
at com.sun.enterprise.naming.impl.TransientContext.resolveContext(TransientContext.java:310)
at com.sun.enterprise.naming.impl.TransientContext.lookup(TransientContext.java:218)
at com.sun.enterprise.naming.impl.SerialContextProviderImpl.lookup(SerialContextProviderImpl.java:77)
at com.sun.enterprise.naming.impl.LocalSerialContextProviderImpl.lookup(LocalSerialContextProviderImpl.java:119)
at com.sun.enterprise.naming.impl.SerialContext.lookup(SerialContext.java:505)
... 54 more
If I shutdown the running GF instance, then the EJBContainer starts and tests execute correctly. We use Jenkins (Hudson) for CI which runs in GF and the build fails. This also causes issues with local development as I have to stop GF to run the tests.
I copied the domain.xml from a clean GF install, changed all the ports (to avoid conflict with running GF instance) and placed it in ./src/test/resources/org/glassfish/embed/.
Here's how I'm creating the EJBContainer:
Map<String, Object> properties = new HashMap<String, Object>();
properties.put(EJBContainer.MODULES, new File[]{new File("target/classes"), new File("target/test-classes")});
ejbContainer = EJBContainer.createEJBContainer(properties);
ctx = ejbContainer.getContext();
Maven dependency.
<dependency>
<groupId>org.glassfish.extras</groupId>
<artifactId>glassfish-embedded-all</artifactId>
<version>3.1</version>
<scope>test</scope>
</dependency>
I feel like I'm missing something simple. Thanks in advance.
When I started using the embeddable EJB container, all of the references I found mentioned either using a pre-installed GF or copying the domain.xml from an installed GF instance (which is what I did). I later stumbled across http://embedded-glassfish.java.net/domain.xml and I no longer have this issue.
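In other words, a small sketch using the path from the question and the URL above:
mkdir -p src/test/resources/org/glassfish/embed
curl -o src/test/resources/org/glassfish/embed/domain.xml \
     http://embedded-glassfish.java.net/domain.xml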
