I want to use the createEJBStubs command, which is described here.
But when I apply it to my .ear file deployed on the server I get the following exception:
CNTR9258E: Error: Unexpected exception "error in opening zip file" occurred.
Has anyone come across this problem? What can be done to create the stubs successfully?
Here's what I get using the -verbose option:
CNTR9258E: Error: Unexpected exception "error in opening zip file" occurred.
java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:203)
at java.util.jar.JarFile.<init>(JarFile.java:132)
at java.util.jar.JarFile.<init>(JarFile.java:97)
at com.ibm.ws.deploy.tools.CreateEJBStubsCommand.main(CreateEJBStubsCommand.java:279)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:592)
at com.ibm.wsspi.bootstrap.WSLauncher.launchMain(WSLauncher.java:183)
at com.ibm.wsspi.bootstrap.WSLauncher.main(WSLauncher.java:90)
at com.ibm.wsspi.bootstrap.WSLauncher.run(WSLauncher.java:72)
at org.eclipse.core.internal.runtime.PlatformActivator$1.run(PlatformActivator.java:78)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:92)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:68)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:400)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:177)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:592)
at org.eclipse.core.launcher.Main.invokeFramework(Main.java:340)
at org.eclipse.core.launcher.Main.basicRun(Main.java:282)
at org.eclipse.core.launcher.Main.run(Main.java:981)
at com.ibm.wsspi.bootstrap.WSPreLauncher.launchEclipse(WSPreLauncher.java:339)
at com.ibm.wsspi.bootstrap.WSPreLauncher.main(WSPreLauncher.java:94)
Command Failed
I use this syntax:
call <WAS's Path>\createEJBStubs.bat <file-name.jar> -cp <project's dependencies>;
P.S.: It is important that all dependencies are in the same folder from which the command line is run.
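For concreteness, a filled-in version of that call on my machine looks roughly like the following; the WebSphere path, module JAR name, and dependency JARs are placeholders for whatever your environment uses, not values from a known-good setup:
call C:\IBM\WebSphere\AppServer\bin\createEJBStubs.bat MyEjbModule.jar -cp lib\dep1.jar;lib\dep2.jar -verbose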
Related
Can anybody tell me what could be causing this exception? It works fine in the testing environment but not on the live servers.
I compared the test WAR file with the existing production WAR file, but I did not find any differences.
The major change I made was converting the project from Ant to Maven, but I am not sure how this is related to the exception.
java.io.FileNotFoundException: http://localhost:8888/getreport
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1890)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1492)
at com.sample.SimpleTest.webcontroller.SimpleController.getreport(SimpleController.java:120)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
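For what it's worth, HttpURLConnection throws FileNotFoundException when the server responds with HTTP 404, so a quick check I can run is to request the same URL from the command line on both environments and compare the status codes (the URL below is just the one from the stack trace; the host and port would change for the live server):
curl -i http://localhost:8888/getreport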
I am trying to run OWBClient.sh from the command line on UNIX, and I am getting the following error:
"The JVM option is invalid: -XX: MaxPermSize=256M. Could not create the Java Virtual Machine."
OWB Version: 10.2.0.5
Oracle Database : 10g
UName: AIX
While exploring the above issue further, I found information suggesting that OWB 10.2 cannot be run on AIX. It would be helpful if anyone can shed some light on whether OWB 10.2.0.5 can run on AIX.
Editing my post to share an update: I have tried to execute owbclient.sh after removing the -XX:MaxPermSize=256M option. I am now getting a different kind of error:
$ ./owbclient.sh
[Launcher]: Error! Cannot find and load the jar file '../../../jdk/jre/lib/rt.jar'. OWB application may not be launched due to this error.
[Launcher]: Error! Cannot find and load the jar file '../../../jdk/jre/lib/jce.jar'. OWB application may not be launched due to this error.
StaticLoader: Setting locale to en__
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:85)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:58)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:60)
at java.lang.reflect.Method.invoke(Method.java:391)
at Launcher.main(Launcher.java:167)
Caused by: java.lang.InternalError: Can't connect to X11 window server using ':0.0' as the value of the DISPLAY variable.
at sun.awt.X11GraphicsEnvironment.initDisplay(Native Method)
at sun.awt.X11GraphicsEnvironment.<clinit>(X11GraphicsEnvironment.java:175)
at java.lang.Class.forName1(Native Method)
at java.lang.Class.forName(Class.java:180)
at java.awt.GraphicsEnvironment.getLocalGraphicsEnvironment(GraphicsEnvironment.java:91)
at sun.awt.motif.MToolkit.<clinit>(MToolkit.java:124)
at java.lang.Class.forName1(Native Method)
at java.lang.Class.forName(Class.java:180)
at java.awt.Toolkit$2.run(Toolkit.java:796)
at java.security.AccessController.doPrivileged1(Native Method)
at java.security.AccessController.doPrivileged(AccessController.java:287)
at java.awt.Toolkit.getDefaultToolkit(Toolkit.java:787)
at java.awt.Component.getToolkitImpl(Component.java:873)
at java.awt.Component.getToolkit(Component.java:857)
at java.awt.Component.createImage(Component.java:2729)
at oracle.wh.ui.common.CommonUtils.getIcon(CommonUtils.java:622)
at oracle.wh.ui.framework.beans.splash.SplashBean.setImageLoc(SplashBean.java:111)
at oracle.wh.ui.framework.beans.splash.SplashBean.<init>(SplashBean.java:42)
at oracle.wh.ui.framework.StaticLoader.main(StaticLoader.java:67)
... 6 more
java.lang.reflect.InvocationTargetException
$
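Since the second failure is the X11 error rather than anything OWB-specific, I am assuming the client simply has no display to talk to from my session. A rough sketch of what I intend to try next, with placeholder hostnames:
# Option 1: connect with X11 forwarding so DISPLAY is set up automatically
ssh -X oracle@aix-host
./owbclient.sh
# Option 2: point DISPLAY at an X server that is reachable from the AIX box
export DISPLAY=myworkstation:0.0
./owbclient.sh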
I tried to install Apache Oozie on an EMR cluster and I am getting the error "Error: IO_ERROR : java.net.ConnectException: Connection refused".
I followed the link below for the installation:
http://pkavuri.blogspot.in/2013/08/oozie-installation-is-simplified.html
I got the error after running the following command:
bin/oozie admin -oozie http://localhost:11000/oozie -status
These are the steps I took after encountering the error:
Moved the Hadoop and common jar files to the folders
“/oozie-3.3.2/distro/target/oozie-3.3.2-distro/oozie-3.3.2/oozie-server/webapps/oozie/WEB-INF/lib”
and “oozie-3.3.2/distro/target/oozie-3.3.2-distro/oozie-3.3.2/lib/”
Downloaded Derby into oozie-3.3.2/libext
The error trace after running the command "tail -100f logs/catalina.out":
ERROR: Oozie could not be started
REASON: java.lang.NoClassDefFoundError: org/apache/hadoop/util/ReflectionUtils
Stacktrace:
-----------------------------------------------------------------
java.lang.NoClassDefFoundError: org/apache/hadoop/util/ReflectionUtils
at org.apache.oozie.service.Services.setServiceInternal(Services.java:359)
at org.apache.oozie.service.Services.<init>(Services.java:108)
at org.apache.oozie.servlet.ServicesLoader.contextInitialized(ServicesLoader.java:38)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4206)
at org.apache.catalina.core.StandardContext.start(StandardContext.java:4705)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:799)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:779)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:601)
at org.apache.catalina.startup.HostConfig.deployWAR(HostConfig.java:943)
at org.apache.catalina.startup.HostConfig.deployWARs(HostConfig.java:778)
at org.apache.catalina.startup.HostConfig.deployApps(HostConfig.java:504)
at org.apache.catalina.startup.HostConfig.start(HostConfig.java:1317)
at org.apache.catalina.startup.HostConfig.lifecycleEvent(HostConfig.java:324)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:142)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1065)
at org.apache.catalina.core.StandardHost.start(StandardHost.java:840)
at org.apache.catalina.core.ContainerBase.start(ContainerBase.java:1057)
at org.apache.catalina.core.StandardEngine.start(StandardEngine.java:463)
at org.apache.catalina.core.StandardService.start(StandardService.java:525)
at org.apache.catalina.core.StandardServer.start(StandardServer.java:754)
at org.apache.catalina.startup.Catalina.start(Catalina.java:595)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:289)
at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:414)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.ReflectionUtils
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1680)
at org.apache.catalina.loader.WebappClassLoader.loadClass(WebappClassLoader.java:1526)
... 27 more
Try creating a libext folder and putting all the Hadoop JARs and the ExtJS JAR in it. Then run oozie-setup.sh followed by oozie-run.sh.
In this case, based on your logs, I guess you are missing hadoop-core.jar.
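A rough sketch of what I mean, assuming the Oozie 3.3.2 distro is unpacked under /opt/oozie-3.3.2 and the Hadoop JARs live under /usr/lib/hadoop (both paths are just examples, and the exact oozie-setup.sh arguments can vary by version):
cd /opt/oozie-3.3.2
mkdir -p libext
cp /usr/lib/hadoop/hadoop-core-*.jar libext/    # contains org.apache.hadoop.util.ReflectionUtils
cp /usr/lib/hadoop/lib/*.jar libext/            # Hadoop's common/dependency jars
cp ~/Downloads/ext-2.2.zip libext/              # ExtJS, needed for the Oozie web console
bin/oozie-setup.sh                              # repackages the Oozie WAR with the jars from libext
bin/oozie-run.sh                                # run in the foreground so startup errors are visible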
I have created a JAR file at the command prompt. While executing the JAR file using the command java -jar MyJAR.jar, I am getting the following error:
Exception in thread "main" java.lang.NoClassDefFoundError.
Caused by: java.lang.ClassNotFoundException: ShipmentData
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
Could not find the main class: ShipmentData. Program will exit.
Please help me sort out this issue.
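For reference, my understanding is that java -jar needs a Main-Class entry in the JAR's manifest, and the named class must actually be present inside the JAR. This is the kind of packaging I am aiming for, assuming ShipmentData.class was compiled into the default package in the current directory:
jar cfe MyJAR.jar ShipmentData ShipmentData.class   # the e flag writes the Main-Class: ShipmentData manifest entry
jar tf MyJAR.jar                                    # verify ShipmentData.class is actually listed
java -jar MyJAR.jar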
Does anyone know if there is a problem using Amazon's S3Distcp tool with MapR running on EMR? I'm trying to use it, but keep getting the following exception in /mnt/var/log/hadoop/steps:
Exception in thread "main" java.lang.RuntimeException: Unable to delete directory hdfs:/tmp/e9333a37-f400-4982-9687-326e33d9b37d/files
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.deleteRecursive(S3DistCp.java:606)
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:464)
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.run(S3DistCp.java:216)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at com.amazon.external.elasticmapreduce.s3distcp.Main.main(Main.java:12)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.RunJar.main(RunJar.java:186)
Caused by: java.io.IOException: Incomplete HDFS URI, no host: hdfs:/tmp/e9333a37-f400-4982-9687-326e33d9b37d/files
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:85)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1416)
at org.apache.hadoop.fs.FileSystem.access$100(FileSystem.java:69)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1450)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1432)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:232)
at com.amazon.external.elasticmapreduce.s3distcp.S3DistCp.deleteRecursive(S3DistCp.java:603)
The command line I'm using to submit the job step is:
elastic-mapreduce --jobflow $JOB_ID --jar s3://us-east-1.elasticmapreduce/libs/s3distcp/1.latest/s3distcp.jar \
--args '--src,s3n://PVData/raw, \
--dest,/PVData/raw'
For the --dest argument, I have also tried maprfs:///PVData/raw and hdfs:///PVData/raw, and they don't work either.
I got an answer to this question over on the MapR forum (http://bit.ly/S7gzcv). The problem was that I needed to specify the temp directory as maprfs:///tmp using the --tmpDir argument to s3distcp.
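For anyone else who hits this, the step submission with that extra argument looks roughly like the following (same bucket and paths as in my question; only the --tmpDir part is new):
elastic-mapreduce --jobflow $JOB_ID \
  --jar s3://us-east-1.elasticmapreduce/libs/s3distcp/1.latest/s3distcp.jar \
  --args '--src,s3n://PVData/raw,--dest,/PVData/raw,--tmpDir,maprfs:///tmp'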