I upgraded from CDH 4.4.0-1.cdh4.4.0.p0.39 to CDH 4.5.0-1.cdh4.5.0.p0.30. In the prior version, I had distcp working from HDFS to S3:
hadoop distcp -i -update -p hdfs://NN:8020/path/to/directory/folder/ s3n://accesskeyid:keypass#mybucket/directory/;
After the upgrade, I'm getting a few weird warnings and errors:
WARN httpclient.HttpMethodReleaseInputStream: Attempting to release HttpMethod in finalize() as its response data stream has gone out of scope. This attempt will not always succeed and cannot be relied upon! Please ensure response data streams are always fully consumed or closed to avoid HTTP connection starvation.
13/12/10 11:26:27 WARN httpclient.HttpMethodReleaseInputStream: Successfully released HttpMethod in finalize(). You were lucky this time... Please ensure response data streams are always fully consumed or closed.
And errors:
Task attempt_201312042223_7900_m_000003_2 failed to report status for 600 seconds. Killing!
attempt_201312042223_7900_m_000003_2: 2013-12-10 10:52:32
attempt_201312042223_7900_m_000003_2: Full thread dump Java HotSpot(TM) 64-Bit Server VM (20.6-b01 mixed mode):
attempt_201312042223_7900_m_000003_2: "org.apache.hadoop.hdfs.PeerCache#4ed9f47" daemon prio=10 tid=0x00007f96ccc8c000 nid=0x2801 waiting on condition [0x00007f96c2f0e000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: TIMED_WAITING (sleeping)
attempt_201312042223_7900_m_000003_2: at java.lang.Thread.sleep(Native Method)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.hdfs.PeerCache.run(PeerCache.java:252)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.hdfs.PeerCache.access$000(PeerCache.java:39)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.hdfs.PeerCache$1.run(PeerCache.java:135)
attempt_201312042223_7900_m_000003_2: at java.lang.Thread.run(Thread.java:662)
attempt_201312042223_7900_m_000003_2: "communication thread" daemon prio=10 tid=0x00007f96ccc87800 nid=0x27f4 in Object.wait() [0x00007f96c327f000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: TIMED_WAITING (on object monitor)
attempt_201312042223_7900_m_000003_2: at java.lang.Object.wait(Native Method)
attempt_201312042223_7900_m_000003_2: - waiting on <0x00000000fcb34ee0> (a java.lang.Object)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.mapred.Task$TaskReporter.run(Task.java:662)
attempt_201312042223_7900_m_000003_2: - locked <0x00000000fcb34ee0> (a java.lang.Object)
attempt_201312042223_7900_m_000003_2: at java.lang.Thread.run(Thread.java:662)
attempt_201312042223_7900_m_000003_2: "Timer thread for monitoring jvm" daemon prio=10 tid=0x00007f96ccc57800 nid=0x27ed in Object.wait() [0x00007f96c3110000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: TIMED_WAITING (on object monitor)
attempt_201312042223_7900_m_000003_2: at java.lang.Object.wait(Native Method)
attempt_201312042223_7900_m_000003_2: - waiting on <0x00000000fcad8078> (a java.util.TaskQueue)
attempt_201312042223_7900_m_000003_2: at java.util.TimerThread.mainLoop(Timer.java:509)
attempt_201312042223_7900_m_000003_2: - locked <0x00000000fcad8078> (a java.util.TaskQueue)
attempt_201312042223_7900_m_000003_2: at java.util.TimerThread.run(Timer.java:462)
attempt_201312042223_7900_m_000003_2: "IPC Parameter Sending Thread #0" daemon prio=10 tid=0x00007f96cca80800 nid=0x27e4 waiting on condition [0x00007f96c33b5000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: TIMED_WAITING (parking)
attempt_201312042223_7900_m_000003_2: at sun.misc.Unsafe.park(Native Method)
attempt_201312042223_7900_m_000003_2: - parking to wait for <0x00000000fcae0080> (a java.util.concurrent.SynchronousQueue$TransferStack)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:196)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.SynchronousQueue$TransferStack.awaitFulfill(SynchronousQueue.java:424)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.SynchronousQueue$TransferStack.transfer(SynchronousQueue.java:323)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.SynchronousQueue.poll(SynchronousQueue.java:874)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:945)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:907)
attempt_201312042223_7900_m_000003_2: at java.lang.Thread.run(Thread.java:662)
attempt_201312042223_7900_m_000003_2: "IPC Client (1065524847) connection to /127.0.0.1:38248 from job_201312042223_7900" daemon prio=10 tid=0x00007f96cca98800 nid=0x27e3 in Object.wait() [0x00007f96c34b6000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: TIMED_WAITING (on object monitor)
attempt_201312042223_7900_m_000003_2: at java.lang.Object.wait(Native Method)
attempt_201312042223_7900_m_000003_2: - waiting on <0x00000000fcac0188> (a org.apache.hadoop.ipc.Client$Connection)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.ipc.Client$Connection.waitForWork(Client.java:803)
attempt_201312042223_7900_m_000003_2: - locked <0x00000000fcac0188> (a org.apache.hadoop.ipc.Client$Connection)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.ipc.Client$Connection.run(Client.java:846)
attempt_201312042223_7900_m_000003_2: "Thread for syncLogs" daemon prio=10 tid=0x00007f96cca5c800 nid=0x27e2 waiting on condition [0x00007f96c36bf000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: TIMED_WAITING (sleeping)
attempt_201312042223_7900_m_000003_2: at java.lang.Thread.sleep(Native Method)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.mapred.Child$3.run(Child.java:156)
attempt_201312042223_7900_m_000003_2: "Low Memory Detector" daemon prio=10 tid=0x00007f96cc0be000 nid=0x27dc runnable [0x0000000000000000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: RUNNABLE
attempt_201312042223_7900_m_000003_2: "C2 CompilerThread1" daemon prio=10 tid=0x00007f96cc0bb800 nid=0x27db waiting on condition [0x0000000000000000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: RUNNABLE
attempt_201312042223_7900_m_000003_2: "C2 CompilerThread0" daemon prio=10 tid=0x00007f96cc0b9000 nid=0x27da waiting on condition [0x0000000000000000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: RUNNABLE
attempt_201312042223_7900_m_000003_2: "Signal Dispatcher" daemon prio=10 tid=0x00007f96cc0b6800 nid=0x27d9 waiting on condition [0x0000000000000000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: RUNNABLE
attempt_201312042223_7900_m_000003_2: "Finalizer" daemon prio=10 tid=0x00007f96cc09a000 nid=0x27d8 in Object.wait() [0x00007f96c89f8000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: WAITING (on object monitor)
attempt_201312042223_7900_m_000003_2: at java.lang.Object.wait(Native Method)
attempt_201312042223_7900_m_000003_2: - waiting on <0x00000000fcac0480> (a java.lang.ref.ReferenceQueue$Lock)
attempt_201312042223_7900_m_000003_2: at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:118)
attempt_201312042223_7900_m_000003_2: - locked <0x00000000fcac0480> (a java.lang.ref.ReferenceQueue$Lock)
attempt_201312042223_7900_m_000003_2: at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:134)
attempt_201312042223_7900_m_000003_2: at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:159)
attempt_201312042223_7900_m_000003_2: "Reference Handler" daemon prio=10 tid=0x00007f96cc098000 nid=0x27d7 in Object.wait() [0x00007f96c8af9000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: WAITING (on object monitor)
attempt_201312042223_7900_m_000003_2: at java.lang.Object.wait(Native Method)
attempt_201312042223_7900_m_000003_2: - waiting on <0x00000000fcad0070> (a java.lang.ref.Reference$Lock)
attempt_201312042223_7900_m_000003_2: at java.lang.Object.wait(Object.java:485)
attempt_201312042223_7900_m_000003_2: at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:116)
attempt_201312042223_7900_m_000003_2: - locked <0x00000000fcad0070> (a java.lang.ref.Reference$Lock)
attempt_201312042223_7900_m_000003_2: "main" prio=10 tid=0x00007f96cc00e800 nid=0x27be waiting on condition [0x00007f96d3837000]
attempt_201312042223_7900_m_000003_2: java.lang.Thread.State: WAITING (parking)
attempt_201312042223_7900_m_000003_2: at sun.misc.Unsafe.park(Native Method)
attempt_201312042223_7900_m_000003_2: - parking to wait for <0x00000000f7669c00> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.locks.LockSupport.park(LockSupport.java:156)
attempt_201312042223_7900_m_000003_2: at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:1987)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.apache.http.impl.conn.tsccm.WaitingThread.await(WaitingThread.java:158)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.apache.http.impl.conn.tsccm.ConnPoolByRoute.getEntryBlocking(ConnPoolByRoute.java:402)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.apache.http.impl.conn.tsccm.ConnPoolByRoute$1.getPoolEntry(ConnPoolByRoute.java:299)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.apache.http.impl.conn.tsccm.ThreadSafeClientConnManager$1.getConnection(ThreadSafeClientConnManager.java:242)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:456)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:906)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.apache.http.impl.client.AbstractHttpClient.execute(AbstractHttpClient.java:805)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.impl.rest.httpclient.RestStorageService.performRequest(RestStorageService.java:334)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.impl.rest.httpclient.RestStorageService.performRequest(RestStorageService.java:281)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.impl.rest.httpclient.RestStorageService.performRestGet(RestStorageService.java:981)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.impl.rest.httpclient.RestStorageService.getObjectImpl(RestStorageService.java:2150)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.impl.rest.httpclient.RestStorageService.getObjectImpl(RestStorageService.java:2087)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.StorageService.getObject(StorageService.java:1140)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.S3Service.getObject(S3Service.java:2583)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.S3Service.getObject(S3Service.java:84)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.StorageService.getObject(StorageService.java:525)
attempt_201312042223_7900_m_000003_2: at com.cloudera.org.jets3t.service.S3Service.getObject(S3Service.java:1377)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.fs.s3native.Jets3tNativeFileSystemStore.retrieveMetadata(Jets3tNativeFileSystemStore.java:118)
attempt_201312042223_7900_m_000003_2: at sun.reflect.GeneratedMethodAccessor2.invoke(Unknown Source)
attempt_201312042223_7900_m_000003_2: at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
attempt_201312042223_7900_m_000003_2: at java.lang.reflect.Method.invoke(Method.java:597)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.fs.s3native.$Proxy11.retrieveMetadata(Unknown Source)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.fs.s3native.NativeS3FileSystem.getFileStatus(NativeS3FileSystem.java:414)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1378)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.tools.DistCp$CopyFilesMapper.rename(DistCp.java:484)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.tools.DistCp$CopyFilesMapper.copy(DistCp.java:461)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:547)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.tools.DistCp$CopyFilesMapper.map(DistCp.java:314)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
attempt_201312042223_7900_m_000003_2: at java.security.AccessController.doPrivileged(Native Method)
attempt_201312042223_7900_m_000003_2: at javax.security.auth.Subject.doAs(Subject.java:396)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
attempt_201312042223_7900_m_000003_2: at org.apache.hadoop.mapred.Child.main(Child.java:262)
attempt_201312042223_7900_m_000003_2: "VM Thread" prio=10 tid=0x00007f96cc091800 nid=0x27d6 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#0 (ParallelGC)" prio=10 tid=0x00007f96cc021800 nid=0x27bf runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#1 (ParallelGC)" prio=10 tid=0x00007f96cc023800 nid=0x27c0 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#2 (ParallelGC)" prio=10 tid=0x00007f96cc025800 nid=0x27c1 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#3 (ParallelGC)" prio=10 tid=0x00007f96cc027000 nid=0x27c2 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#4 (ParallelGC)" prio=10 tid=0x00007f96cc029000 nid=0x27c3 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#5 (ParallelGC)" prio=10 tid=0x00007f96cc02b000 nid=0x27c4 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#6 (ParallelGC)" prio=10 tid=0x00007f96cc02c800 nid=0x27c5 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#7 (ParallelGC)" prio=10 tid=0x00007f96cc02e800 nid=0x27c6 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#8 (ParallelGC)" prio=10 tid=0x00007f96cc030800 nid=0x27c7 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#9 (ParallelGC)" prio=10 tid=0x00007f96cc032000 nid=0x27c8 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#10 (ParallelGC)" prio=10 tid=0x00007f96cc034000 nid=0x27c9 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#11 (ParallelGC)" prio=10 tid=0x00007f96cc036000 nid=0x27ca runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#12 (ParallelGC)" prio=10 tid=0x00007f96cc037800 nid=0x27cb runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#13 (ParallelGC)" prio=10 tid=0x00007f96cc039800 nid=0x27cc runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#14 (ParallelGC)" prio=10 tid=0x00007f96cc03b800 nid=0x27cd runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#15 (ParallelGC)" prio=10 tid=0x00007f96cc03d000 nid=0x27ce runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#16 (ParallelGC)" prio=10 tid=0x00007f96cc03f000 nid=0x27cf runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#17 (ParallelGC)" prio=10 tid=0x00007f96cc041000 nid=0x27d0 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#18 (ParallelGC)" prio=10 tid=0x00007f96cc042800 nid=0x27d1 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#19 (ParallelGC)" prio=10 tid=0x00007f96cc044800 nid=0x27d2 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#20 (ParallelGC)" prio=10 tid=0x00007f96cc046800 nid=0x27d3 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#21 (ParallelGC)" prio=10 tid=0x00007f96cc048000 nid=0x27d4 runnable
attempt_201312042223_7900_m_000003_2: "GC task thread#22 (ParallelGC)" prio=10 tid=0x00007f96cc04a000 nid=0x27d5 runnable
attempt_201312042223_7900_m_000003_2: "VM Periodic Task Thread" prio=10 tid=0x00007f96cc0d1000 nid=0x27dd waiting on condition
attempt_201312042223_7900_m_000003_2: JNI global references: 1678
attempt_201312042223_7900_m_000003_2: Heap
attempt_201312042223_7900_m_000003_2: PSYoungGen total 191168K, used 134794K [0x00000000f2ab0000, 0x0000000100000000, 0x0000000100000000)
attempt_201312042223_7900_m_000003_2: eden space 163904K, 78% used [0x00000000f2ab0000,0x00000000fa91fef0,0x00000000fcac0000)
attempt_201312042223_7900_m_000003_2: from space 27264K, 19% used [0x00000000fcac0000,0x00000000fcff2af8,0x00000000fe560000)
attempt_201312042223_7900_m_000003_2: to space 27264K, 0% used [0x00000000fe560000,0x00000000fe560000,0x0000000100000000)
attempt_201312042223_7900_m_000003_2: PSOldGen total 436928K, used 0K [0x00000000d8000000, 0x00000000f2ab0000, 0x00000000f2ab0000)
attempt_201312042223_7900_m_000003_2: object space 436928K, 0% used [0x00000000d8000000,0x00000000d8000000,0x00000000f2ab0000)
attempt_201312042223_7900_m_000003_2: PSPermGen total 28160K, used 28023K [0x00000000d2e00000, 0x00000000d4980000, 0x00000000d8000000)
attempt_201312042223_7900_m_000003_2: object space 28160K, 99% used [0x00000000d2e00000,0x00000000d495dd20,0x00000000d4980000)
I retried the same dataset from another cluster running the prior version and had no issues. Did something change so that distcp no longer properly closes HTTP connections when using jets3t?
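For what it's worth, the thread dump above shows the main task thread parked in ConnPoolByRoute.getEntryBlocking, i.e. waiting for a free connection from JetS3t's HTTP connection pool, which matches the finalizer warnings about response streams going out of scope unclosed. As a possible stopgap while investigating, JetS3t's pool limits can be raised via a jets3t.properties file on the task classpath — a sketch only, assuming CDH's repackaged JetS3t still reads this file and that the leak is slow enough for a larger pool to help:

```properties
# jets3t.properties — place on the classpath of the distcp tasks.
# Raise the HttpClient connection pool limit so a leaked response
# stream does not immediately starve the pool (JetS3t's default is 20).
httpclient.max-connections=100
# Upper bound on concurrent S3 service threads (default 2).
s3service.max-thread-count=20
```

This does not fix the underlying unclosed-stream behavior; it only delays pool exhaustion.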
Any help here would be greatly appreciated.
My objective:
Use NiFi running on an HDF Docker container to store data into HBase running on an HDP Docker container.
Progress:
I am running two docker containers: NiFi and HBase. I have configured NiFi's PutHBaseJSON processor to write data to HBase (puthbasejson_configuration.png).
Below are the configurations I updated in the processor:
PutHBaseJSON = HBase_1_1_2_ClientService
Table Name = publictrans
Row Identifier Field Name = Vehicle_ID
Row Identifier Encoding Strategy = String
Column Family = trafficpatterns
Batch Size = 25
Complex Field Strategy = Text
Field Encoding Strategy = String
HBase Client Service
I also configured NiFi's HBase client service on that processor, so NiFi knows the IP address of ZooKeeper and can ask ZooKeeper where the HBase Master is located (hbaseclientservice_configuration.png).
Configure Controller Service for HBaseClient:
ZooKeeper Quorum = 172.25.0.3
ZooKeeper Client Port = 2181
ZooKeeper ZNode Parent = /hbase-unsecure
HBase Client Retries = 1
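Before enabling the controller service, it can help to confirm that the ZooKeeper and HBase Master ports in the configuration above are actually reachable from the NiFi container. A minimal sketch (the address 172.25.0.3 and ports 2181/16000/16020 are the values assumed from this setup):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # ZooKeeper client port, HBase Master RPC, HBase RegionServer RPC
    for port in (2181, 16000, 16020):
        status = "open" if port_open("172.25.0.3", port) else "unreachable"
        print(f"172.25.0.3:{port} -> {status}")
```

Note that a port being open is necessary but not sufficient: the master may still be advertising a hostname the client cannot resolve, which is why the /etc/hosts entries below matter.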
Problem:
The problem I am facing is that NiFi is unable to connect to the HBase Master. I get the following message: "Failed to invoke #OnEnabled method due to ... hbase.client.RetriesExhaustedException ... hbase.MasterNotRunningException ... java.net.ConnectException: Connection refused." The full stack trace is below (hbaseMasterNotRunningException Stack Trace).
Configurations I made to troubleshoot the problem:
In the HDF container, I updated /etc/hosts with 172.25.0.3 -> hdp.hortonworks.com. In the HDP container, I updated the hosts file with 172.25.0.2 -> hdf.hortonworks.com, so both containers are aware of each other's hostnames.
I port forwarded the needed ports for NiFi, ZooKeeper, and HBase when I built the HDF and HDP containers. I checked that all HBase ports were exposed on the HDP container; the image shows all the ports HDP is listening on, including HBase's ports (ports_hdp_listening_on.png). Here is an image of all ports needed by HBase; I filtered for the port keyword in Ambari (hbase_ports_needed.png).
Ports 16000 and 16020 both looked suspicious, since all the other entries followed the pattern :::port while those two had an address preceding the port. So I checked whether I could connect to HDP from HDF using telnet 172.25.0.3 16000 and received the output:
Trying 172.25.0.3...
Connected to 172.25.0.3.
Escape character is '^]'.
So I was able to connect to the HDP container.
hbaseMasterNotRunningException Stack Trace:
2017-01-25 22:23:03,342 ERROR [StandardProcessScheduler Thread-7] o.a.n.c.s.StandardControllerServiceNode HBase_1_1_2_ClientService[id=d3eaf393-0159-1000-ffff-ffffa95f1940] Failed to invoke #OnEnabled method due to org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
Wed Jan 25 22:23:03 UTC 2017, RpcRetryingCaller{globalStartTime=1485382983338, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.net.ConnectException: Connection refused
2017-01-25 22:23:03,348 ERROR [StandardProcessScheduler Thread-7] o.a.n.c.s.StandardControllerServiceNode
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
Wed Jan 25 22:23:03 UTC 2017, RpcRetryingCaller{globalStartTime=1485382983338, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.net.ConnectException: Connection refused
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147) ~[na:na]
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917) ~[na:na]
at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:413) ~[na:na]
at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:397) ~[na:na]
at org.apache.nifi.hbase.HBase_1_1_2_ClientService.onEnabled(HBase_1_1_2_ClientService.java:187) ~[na:na]
at sun.reflect.GeneratedMethodAccessor568.invoke(Unknown Source) ~[na:na]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.8.0_111]
at java.lang.reflect.Method.invoke(Method.java:498) ~[na:1.8.0_111]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:137) ~[na:na]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:125) ~[na:na]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotations(ReflectionUtils.java:70) ~[na:na]
at org.apache.nifi.util.ReflectionUtils.invokeMethodsWithAnnotation(ReflectionUtils.java:47) ~[na:na]
at org.apache.nifi.controller.service.StandardControllerServiceNode$2.run(StandardControllerServiceNode.java:345) ~[na:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_111]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180) [na:1.8.0_111]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_111]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_111]
at java.lang.Thread.run(Thread.java:745) [na:1.8.0_111]
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.net.ConnectException: Connection refused
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1533) ~[na:na]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553) ~[na:na]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704) ~[na:na]
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38) ~[na:na]
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124) ~[na:na]
... 19 common frames omitted
Caused by: com.google.protobuf.ServiceException: java.net.ConnectException: Connection refused
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:223) ~[na:na]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287) ~[na:na]
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.isMasterRunning(MasterProtos.java:50918) ~[na:na]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.isMasterRunning(ConnectionManager.java:1564) ~[na:na]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1502) ~[na:na]
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1524) ~[na:na]
... 23 common frames omitted
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method) ~[na:1.8.0_111]
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717) ~[na:1.8.0_111]
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206) ~[na:na]
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531) ~[na:na]
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495) ~[na:na]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupConnection(RpcClientImpl.java:424) ~[na:na]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.setupIOstreams(RpcClientImpl.java:748) ~[na:na]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.writeRequest(RpcClientImpl.java:920) ~[na:na]
at org.apache.hadoop.hbase.ipc.RpcClientImpl$Connection.tracedWriteRequest(RpcClientImpl.java:889) ~[na:na]
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1222) ~[na:na]
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213) ~[na:na]
... 28 common frames omitted
2017-01-25 22:23:03,348 ERROR [StandardProcessScheduler Thread-7] o.a.n.c.s.StandardControllerServiceNode Failed to invoke #OnEnabled method of HBase_1_1_2_ClientService[id=d3eaf393-0159-1000-ffff-ffffa95f1940] due to org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after attempts=1, exceptions:
Wed Jan 25 22:23:03 UTC 2017, RpcRetryingCaller{globalStartTime=1485382983338, pause=100, retries=1}, org.apache.hadoop.hbase.MasterNotRunningException: com.google.protobuf.ServiceException: java.net.ConnectException: Connection refused
I am still dealing with this problem:
Has anyone set up a NiFi HDF Docker container to store data into HBase on an HDP Docker container?
I'd like to use Artifactory Pro (version 4.14) as a proxy for an external P2 repository. My idea is that all requests in our Maven Tycho build go through Artifactory's proxy P2 repository.
To that end, I created a remote P2 repository for the URL http://download.eclipse.org/releases/mars/ in Artifactory and configured this P2 repository in my Maven Tycho build (the whole configuration can be found here). Unfortunately, my build fails with java.net.ConnectException: Connection timed out: connect (full error log below).
My investigation revealed that the original P2 repository is a composite P2 repository, and one of its children has http://download.eclipse.org/technology/epp/packages/mars/ as its location URL (you can find the child locations in compositeContent.xml). Maven Tycho tries to connect to this child location directly. My expectation was that Artifactory would replace this URL with an internally cached variant, so that the client doesn't go directly to the remote child URL (as, for example, Sonatype Nexus does).
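For illustration, a composite p2 repository's index looks roughly like this (a simplified sketch, not the exact Mars file). Each child location is an absolute URL that the p2 client resolves itself, which is why Tycho ends up bypassing the proxy:

```xml
<?xml version='1.0' encoding='UTF-8'?>
<repository name='Eclipse Mars releases'
    type='org.eclipse.equinox.internal.p2.metadata.repository.CompositeMetadataRepository'
    version='1.0.0'>
  <children size='2'>
    <!-- absolute child locations: the p2 client contacts these directly,
         unless the proxy rewrites them to point back at itself -->
    <child location='http://download.eclipse.org/technology/epp/packages/mars/'/>
    <child location='http://download.eclipse.org/releases/mars/201602261000/'/>
  </children>
</repository>
```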
Is this just a configuration issue? If so, where can I configure it in Artifactory?
Whole error log:
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building target-definition-dsl-example 1.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) # target-definition-dsl-example ---
[INFO]
[INFO] --- tycho-eclipserun-plugin:0.25.0:eclipse-run (default) # target-definition-dsl-example ---
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/org.eclipse.updates.mars.201602261000/ (142B)
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/org.eclipse.updates.mars.201602261000/ (142B)
[INFO] Adding repository http://www-artifactory:8081/artifactory/org.eclipse.updates.mars.201602261000
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/obeo-releng-tools-releases/ (128B)
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/obeo-releng-tools-releases/ (128B)
[INFO] Adding repository http://www-artifactory:8081/artifactory/obeo-releng-tools-releases
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/obeo-releng-tools-releases/2.0/ (128B)
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/obeo-releng-tools-releases/2.0/ (128B)
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/obeo-releng-tools-releases/2.1/ (128B)
[INFO] Fetching p2.index from http://www-artifactory:8081/artifactory/obeo-releng-tools-releases/2.1/ (128B)
[INFO] Expected eclipse log file: C:\Users\10520312\workspace\target-definition-dsl-example\target\eclipserun-work\data\.metadata\.log
[INFO] Command line:
[C:\devel\Java\jdk1.8.0_72_x64\jre\bin\java.exe, -jar, C:\Users\10520312\.m2\repository\p2\osgi\bundle\org.eclipse.equinox.launcher\1.3.100.v20150511-1540\org.eclipse.equinox.launcher-1.3.100.v20150511-1540.jar, -install, C:\Users\10520312\workspace\target-definition-dsl-example\target\eclipserun-work, -configuration, C:\Users\10520312\workspace\target-definition-dsl-example\target\eclipserun-work\configuration, -consoleLog, -application, fr.obeo.releng.targetplatform.targetPlatform.converter, eclipse-mars.tpd]
>> Fetching p2.index from http://www-artifactory:8081/artifactory/eclipse-mars-releases/ (0B of 128B at 0B/s)
>> Fetching p2.index from http://www-artifactory:8081/artifactory/eclipse-mars-releases/ (128B of 128B at 0B/s)
>> 1 operation remaining.
>> Fetching compositeContent.jar from http://www-artifactory:8081/artifactory/eclipse-mars-releases/ (0B of 502B at 0B/s)
>> 1 operation remaining.
>> Fetching compositeContent.jar from http://www-artifactory:8081/artifactory/eclipse-mars-releases/ (502B of 502B at 0B/s)
!SESSION 2017-01-16 09:31:30.389 -----------------------------------------------
eclipse.buildId=unknown
java.version=1.8.0_72
java.vendor=Oracle Corporation
BootLoader constants: OS=win32, ARCH=x86_64, WS=win32, NL=de_DE
Framework arguments: -application fr.obeo.releng.targetplatform.targetPlatform.converter eclipse-mars.tpd
Command-line arguments: -consoleLog -application fr.obeo.releng.targetplatform.targetPlatform.converter eclipse-mars.tpd
!ENTRY org.eclipse.equinox.p2.transport.ecf 2 0 2017-01-16 09:31:56.049
!MESSAGE Connection to http://download.eclipse.org/technology/epp/packages/mars/p2.index failed on Connection timed out: connect. Retry attempt 0 started
!STACK 0
java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at fr.obeo.releng.targetplatform.pde.Converter$CustomDiagnostician.validate(Converter.java:239)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:137)
at org.eclipse.emf.ecore.util.Diagnostician.doValidateContents(Diagnostician.java:185)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:161)
at fr.obeo.releng.targetplatform.pde.Converter$CustomDiagnostician.validate(Converter.java:239)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:137)
at org.eclipse.emf.ecore.util.Diagnostician.doValidateContents(Diagnostician.java:185)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:161)
at fr.obeo.releng.targetplatform.pde.Converter$CustomDiagnostician.validate(Converter.java:239)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:137)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:108)
at fr.obeo.releng.targetplatform.pde.Converter.doGenerateTargetDefinitionFile(Converter.java:106)
at fr.obeo.releng.targetplatform.pde.Converter.generateTargetDefinitionFile(Converter.java:74)
at fr.obeo.releng.targetplatform.pde.ConverterApplication.start(ConverterApplication.java:59)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:134)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:380)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:669)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:608)
at org.eclipse.equinox.launcher.Main.run(Main.java:1515)
at org.eclipse.equinox.launcher.Main.main(Main.java:1488)
Caused by: java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:117)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.eclipse.ecf.provider.filetransfer.httpclient4.HttpClientFileSystemBrowser.runRequest(HttpClientFileSystemBrowser.java:259)
at org.eclipse.ecf.provider.filetransfer.browse.AbstractFileSystemBrowser$DirectoryJob.run(AbstractFileSystemBrowser.java:69)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
!SUBENTRY 1 org.eclipse.equinox.p2.transport.ecf 4 1002 2017-01-16 10:03:56.928
!MESSAGE Unable to connect to repository http://download.eclipse.org/technology/epp/packages/mars/content.xml
!STACK 0
java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:117)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.eclipse.ecf.provider.filetransfer.httpclient4.HttpClientFileSystemBrowser.runRequest(HttpClientFileSystemBrowser.java:259)
at org.eclipse.ecf.provider.filetransfer.browse.AbstractFileSystemBrowser$DirectoryJob.run(AbstractFileSystemBrowser.java:69)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
!ENTRY org.eclipse.equinox.p2.transport.ecf 2 0 2017-01-16 10:04:18.107
!MESSAGE Connection to http://download.eclipse.org/technology/epp/packages/mars/p2.index failed on Connection timed out: connect. Retry attempt 0 started
!STACK 0
java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.eclipse.ecf.internal.provider.filetransfer.httpclient4.ECFHttpClientProtocolSocketFactory.connectSocket(ECFHttpClientProtocolSocketFactory.java:86)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
at org.apache.http.impl.conn.AbstractPoolEntry.open(AbstractPoolEntry.java:144)
at org.apache.http.impl.conn.AbstractPooledConnAdapter.open(AbstractPooledConnAdapter.java:131)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.eclipse.ecf.provider.filetransfer.httpclient4.HttpClientRetrieveFileTransfer.performConnect(HttpClientRetrieveFileTransfer.java:1077)
at org.eclipse.ecf.provider.filetransfer.httpclient4.HttpClientRetrieveFileTransfer.access$0(HttpClientRetrieveFileTransfer.java:1068)
at org.eclipse.ecf.provider.filetransfer.httpclient4.HttpClientRetrieveFileTransfer$1.performFileTransfer(HttpClientRetrieveFileTransfer.java:1064)
at org.eclipse.ecf.filetransfer.FileTransferJob.run(FileTransferJob.java:73)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
!ENTRY org.eclipse.equinox.p2.core 4 0 2017-01-16 10:05:21.362
!MESSAGE Provisioning exception
!STACK 1
org.eclipse.equinox.p2.core.ProvisionException: Unable to connect to repository http://download.eclipse.org/technology/epp/packages/mars/content.xml
at org.eclipse.equinox.internal.p2.repository.CacheManager.createCache(CacheManager.java:243)
at org.eclipse.equinox.internal.p2.metadata.repository.SimpleMetadataRepositoryFactory.getLocalFile(SimpleMetadataRepositoryFactory.java:66)
at org.eclipse.equinox.internal.p2.metadata.repository.SimpleMetadataRepositoryFactory.load(SimpleMetadataRepositoryFactory.java:88)
at org.eclipse.equinox.internal.p2.metadata.repository.MetadataRepositoryManager.factoryLoad(MetadataRepositoryManager.java:57)
at org.eclipse.equinox.internal.p2.repository.helpers.AbstractRepositoryManager.loadRepository(AbstractRepositoryManager.java:768)
at org.eclipse.equinox.internal.p2.repository.helpers.AbstractRepositoryManager.loadRepository(AbstractRepositoryManager.java:668)
at org.eclipse.equinox.internal.p2.metadata.repository.MetadataRepositoryManager.loadRepository(MetadataRepositoryManager.java:96)
at org.eclipse.equinox.internal.p2.metadata.repository.MetadataRepositoryManager.loadRepository(MetadataRepositoryManager.java:92)
at org.eclipse.equinox.internal.p2.metadata.repository.CompositeMetadataRepository.addChild(CompositeMetadataRepository.java:166)
at org.eclipse.equinox.internal.p2.metadata.repository.CompositeMetadataRepository.<init>(CompositeMetadataRepository.java:106)
at org.eclipse.equinox.internal.p2.metadata.repository.CompositeMetadataRepositoryFactory.load(CompositeMetadataRepositoryFactory.java:122)
at org.eclipse.equinox.internal.p2.metadata.repository.MetadataRepositoryManager.factoryLoad(MetadataRepositoryManager.java:57)
at org.eclipse.equinox.internal.p2.repository.helpers.AbstractRepositoryManager.loadRepository(AbstractRepositoryManager.java:768)
at org.eclipse.equinox.internal.p2.repository.helpers.AbstractRepositoryManager.loadRepository(AbstractRepositoryManager.java:668)
at org.eclipse.equinox.internal.p2.metadata.repository.MetadataRepositoryManager.loadRepository(MetadataRepositoryManager.java:96)
at org.eclipse.equinox.internal.p2.metadata.repository.MetadataRepositoryManager.loadRepository(MetadataRepositoryManager.java:92)
at fr.obeo.releng.targetplatform.validation.TargetPlatformValidator.checkIUIDAndRangeInRepository(TargetPlatformValidator.java:807)
at sun.reflect.GeneratedMethodAccessor11.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.xtext.validation.AbstractDeclarativeValidator$MethodWrapper.invoke(AbstractDeclarativeValidator.java:118)
at org.eclipse.xtext.validation.AbstractDeclarativeValidator.internalValidate(AbstractDeclarativeValidator.java:312)
at org.eclipse.xtext.validation.AbstractInjectableValidator.validate(AbstractInjectableValidator.java:71)
at org.eclipse.xtext.validation.CompositeEValidator.validate(CompositeEValidator.java:151)
at org.eclipse.emf.ecore.util.Diagnostician.doValidate(Diagnostician.java:171)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:158)
at fr.obeo.releng.targetplatform.pde.Converter$CustomDiagnostician.validate(Converter.java:239)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:137)
at org.eclipse.emf.ecore.util.Diagnostician.doValidateContents(Diagnostician.java:185)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:161)
at fr.obeo.releng.targetplatform.pde.Converter$CustomDiagnostician.validate(Converter.java:239)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:137)
at org.eclipse.emf.ecore.util.Diagnostician.doValidateContents(Diagnostician.java:185)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:161)
at fr.obeo.releng.targetplatform.pde.Converter$CustomDiagnostician.validate(Converter.java:239)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:137)
at org.eclipse.emf.ecore.util.Diagnostician.validate(Diagnostician.java:108)
at fr.obeo.releng.targetplatform.pde.Converter.doGenerateTargetDefinitionFile(Converter.java:106)
at fr.obeo.releng.targetplatform.pde.Converter.generateTargetDefinitionFile(Converter.java:74)
at fr.obeo.releng.targetplatform.pde.ConverterApplication.start(ConverterApplication.java:59)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:134)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:380)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:669)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:608)
at org.eclipse.equinox.launcher.Main.run(Main.java:1515)
at org.eclipse.equinox.launcher.Main.main(Main.java:1488)
Caused by: java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:117)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.eclipse.ecf.provider.filetransfer.httpclient4.HttpClientFileSystemBrowser.runRequest(HttpClientFileSystemBrowser.java:259)
at org.eclipse.ecf.provider.filetransfer.browse.AbstractFileSystemBrowser$DirectoryJob.run(AbstractFileSystemBrowser.java:69)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
!SUBENTRY 1 org.eclipse.equinox.p2.transport.ecf 4 1002 2017-01-16 10:05:21.364
!MESSAGE Unable to connect to repository http://download.eclipse.org/technology/epp/packages/mars/content.xml
!STACK 0
java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.waitForConnect(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:85)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:117)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.eclipse.ecf.provider.filetransfer.httpclient4.HttpClientFileSystemBrowser.runRequest(HttpClientFileSystemBrowser.java:259)
at org.eclipse.ecf.provider.filetransfer.browse.AbstractFileSystemBrowser$DirectoryJob.run(AbstractFileSystemBrowser.java:69)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
[ERROR] Unable to read repository at http://www-artifactory:8081/artifactory/eclipse-mars-releases/.
Problems occurred during generation of target platform definition file.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 34:02 min
[INFO] Finished at: 2017-01-16T10:05:22+01:00
[INFO] Final Memory: 21M/411M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.eclipse.tycho.extras:tycho-eclipserun-plugin:0.25.0:eclipse-run (default) on project target-definition-dsl-example: Error while executing platform: Error while executing platform (return code: -1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
compositeContent.xml
<?xml version='1.0' encoding='UTF-8'?>
<?compositeMetadataRepository version='1.0.0'?>
<repository name='Eclipse Mars repository' type='org.eclipse.equinox.internal.p2.metadata.repository.CompositeMetadataRepository' version='1.0.0'>
<properties size='3'>
<property name='p2.timestamp' value='1313779613760'/>
<property name='p2.atomic.composite.loading' value='true'/>
</properties>
<children size='4'>
<child location='http://download.eclipse.org/technology/epp/packages/mars/'/>
<child location='201506241002' />
<child location='201510021000' />
<child location='201602261000' />
</children>
</repository>
I think the best way to achieve this is to create a virtual repository in Artifactory and enter http://download.eclipse.org/releases/mars/ as the remote URL. You can then use the virtual repository's URL in your Tycho configuration. Under the hood this creates a remote (proxy) repository for download.eclipse.org, but your build only ever talks to the virtual repository.
See the Artifactory documentation for further details.
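As an illustration, the .tpd file could then point at the Artifactory virtual repository instead of download.eclipse.org directly. This is only a sketch: the repository path `eclipse-mars-virtual` and the installable-unit name are placeholders for whatever you actually configure in Artifactory and require in your target platform.

```
// eclipse-mars.tpd -- sketch only; repository path and IU name are placeholders
target "eclipse-mars"

location "http://www-artifactory:8081/artifactory/eclipse-mars-virtual/" {
    // list the installable units your build needs, e.g.:
    org.eclipse.sdk.ide lazy
}
```

With this in place, the `targetPlatform.converter` application resolves everything through the Artifactory host, so the build no longer depends on direct access to download.eclipse.org.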
Command:
wcbdd#ubuntu:~/apache/oozie-4.1.0/distro/target/oozie-4.1.0/bin$ sudo -u oozie ./ooziedb.sh create -sqlfile oozie.sql -run
setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
Validate DB Connection
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/util/ReflectionUtils
at org.apache.oozie.service.Services.setServiceInternal(Services.java:374)
at org.apache.oozie.service.Services.<init>(Services.java:110)
at org.apache.oozie.tools.OozieDBCLI.getJdbcConf(OozieDBCLI.java:163)
at org.apache.oozie.tools.OozieDBCLI.createConnection(OozieDBCLI.java:845)
at org.apache.oozie.tools.OozieDBCLI.validateConnection(OozieDBCLI.java:853)
at org.apache.oozie.tools.OozieDBCLI.createDB(OozieDBCLI.java:181)
at org.apache.oozie.tools.OozieDBCLI.run(OozieDBCLI.java:125)
at org.apache.oozie.tools.OozieDBCLI.main(OozieDBCLI.java:76)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.util.ReflectionUtils
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 8 more
I have just started integrating RHadoop: R-Studio Server is integrated with Hadoop, but I get an error when running map-reduce jobs with the following lines of code.
library(rmr2)
a <- to.dfs(seq(from=1, to=500, by=3), output="/user/hduser/num")
b <- mapreduce(input=a, map=function(k,v){keyval(v,v*v)})
Stack trace:
15/03/24 21:13:47 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
packageJobJar: [] [/usr/lib/hadoop-mapreduce/hadoop-streaming-2.5.0-cdh5.2.0.jar] /tmp/streamjob4788227373090541042.jar tmpDir=null
15/03/24 21:13:48 INFO client.RMProxy: Connecting to ResourceManager at tungsten10/192.168.0.123:8032
15/03/24 21:13:48 INFO client.RMProxy: Connecting to ResourceManager at tungsten10/192.168.0.123:8032
15/03/24 21:13:49 INFO mapred.FileInputFormat: Total input paths to process : 1
15/03/24 21:13:50 INFO mapreduce.JobSubmitter: number of splits:2
15/03/24 21:13:50 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1427104115974_0009
15/03/24 21:13:50 INFO impl.YarnClientImpl: Submitted application application_1427104115974_0009
15/03/24 21:13:50 INFO mapreduce.Job: The url to track the job: http://XXX.XXX.XXX.XXX:8088/proxy/application_1427104115974_0009/
15/03/24 21:13:50 INFO mapreduce.Job: Running job: job_1427104115974_0009
15/03/24 21:14:02 INFO mapreduce.Job: Job job_1427104115974_0009 running in uber mode : false
15/03/24 21:14:03 INFO mapreduce.Job: map 0% reduce 0%
15/03/24 21:14:07 INFO mapreduce.Job: Task Id : attempt_1427104115974_0009_m_000000_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
15/03/24 21:14:08 INFO mapreduce.Job: Task Id : attempt_1427104115974_0009_m_000001_0, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
15/03/24 21:14:15 INFO mapreduce.Job: Task Id : attempt_1427104115974_0009_m_000001_1, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
15/03/24 21:14:16 INFO mapreduce.Job: Task Id : attempt_1427104115974_0009_m_000000_1, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
15/03/24 21:14:20 INFO mapreduce.Job: Task Id : attempt_1427104115974_0009_m_000001_2, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
15/03/24 21:14:21 INFO mapreduce.Job: Task Id : attempt_1427104115974_0009_m_000000_2, Status : FAILED
Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
at org.apache.hadoop.streaming.PipeMapper.close(PipeMapper.java:130)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:61)
at org.apache.hadoop.streaming.PipeMapRunner.run(PipeMapRunner.java:34)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:450)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
15/03/24 21:14:25 INFO mapreduce.Job: map 100% reduce 0%
15/03/24 21:14:26 INFO mapreduce.Job: Job job_1427104115974_0009 failed with state FAILED due to: Task failed task_1427104115974_0009_m_000001
Job failed as tasks failed. failedMaps:1 failedReduces:0
15/03/24 21:14:26 INFO mapreduce.Job: Counters: 13
Job Counters
Failed map tasks=7
Killed map tasks=1
Launched map tasks=8
Other local map tasks=6
Data-local map tasks=2
Total time spent by all maps in occupied slots (ms)=27095
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=27095
Total vcore-seconds taken by all map tasks=27095
Total megabyte-seconds taken by all map tasks=27745280
Map-Reduce Framework
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
15/03/24 21:14:26 ERROR streaming.StreamJob: Job not Successful!
Streaming Command Failed!
Error in mr(map = map, reduce = reduce, combine = combine, vectorized.reduce, :
hadoop streaming failed with error code 1
15/03/24 21:14:30 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 1440 minutes, Emptier interval = 0 minutes.
Moved: 'hdfs://XXX.XXX.XXX.XXX:8020/tmp/file10076f272b9a' to trash at: hdfs://XXX.XXX.XXX.XXX:8020/user/hduser/.Trash/Current
I have searched extensively for a solution to this problem but have not found one yet.
As I am new to RHadoop, I am stuck here.
Can anyone please help me resolve this? I would be very thankful.
The error occurs because the HADOOP_STREAMING environment variable is not set in your code; it must contain the full path to the streaming jar, including the jar file name. The R code below works for me.
R code (I'm using Hadoop 2.4.0 on Ubuntu):
# Point R at the Hadoop binary and the streaming jar (adjust paths to your install)
Sys.setenv("HADOOP_CMD"="/usr/local/hadoop/bin/hadoop")
Sys.setenv("HADOOP_STREAMING"="/usr/local/hadoop/share/hadoop/tools/lib/hadoop-streaming-2.4.0.jar")
library(rJava)
library(rhdfs)
# Initialise the HDFS connection
hdfs.init()
library(rmr2)
# Write the sequence to HDFS, then square each value in a map-only job
a <- to.dfs(seq(from=1, to=500, by=3), output="/user/hduser/num")
b <- mapreduce(input=a, map=function(k,v){keyval(v,v*v)})
Hope this helps.
I am trying to get the page source of the following URL using HtmlUnit's get method:
http://denydesigns.com/collections/barbara-sherman-fleece-throw-blanket/products/barbara-sherman-antique-fleece-throw-blanket
It gets stuck somewhere, and I have not been able to find out why.
I also checked whether the thread created by HtmlUnit is BLOCKED or WAITING, but this is not the case.
Below is the log generated by HtmlUnit.
18 Jan 2013 04:14:47,832 - main - ERROR - com.gargoylesoftware.htmlunit.javascript.StrictErrorReporter.runtimeError(StrictErrorReporter.java:79) - runtimeError: message=[The data necessary to complete this operation is not yet available.] sourceName=[http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js] line=[16] lineSource=[null] lineOffset=[0]
18 Jan 2013 04:14:47,924 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.html.HTMLDocument.jsxFunction_getElementById(HTMLDocument.java:1049) - getElementById(script1358500487923) did a getElementByName for Internet Explorer
18 Jan 2013 04:14:49,498 - main - ERROR - com.gargoylesoftware.htmlunit.javascript.StrictErrorReporter.runtimeError(StrictErrorReporter.java:79) - runtimeError: message=[The data necessary to complete this operation is not yet available.] sourceName=[http://code.jquery.com/jquery-latest.js] line=[911] lineSource=[null] lineOffset=[0]
18 Jan 2013 04:14:49,565 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.html.HTMLDocument.jsxFunction_getElementById(HTMLDocument.java:1049) - getElementById(sizzle-1358500489525) did a getElementByName for Internet Explorer
18 Jan 2013 04:14:53,047 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.ActiveXObject.jsConstructor(ActiveXObject.java:128) - Automation server can't create object for 'ShockwaveFlash.ShockwaveFlash.7'.
18 Jan 2013 04:14:53,048 - main - ERROR - com.gargoylesoftware.htmlunit.javascript.StrictErrorReporter.runtimeError(StrictErrorReporter.java:79) - runtimeError: message=[Automation server can't create object for 'ShockwaveFlash.ShockwaveFlash.7'.] sourceName=[http://www.google-analytics.com/ga.js] line=[18] lineSource=[null] lineOffset=[0]
18 Jan 2013 04:14:53,060 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.ActiveXObject.jsConstructor(ActiveXObject.java:128) - Automation server can't create object for 'ShockwaveFlash.ShockwaveFlash.6'.
18 Jan 2013 04:14:53,061 - main - ERROR - com.gargoylesoftware.htmlunit.javascript.StrictErrorReporter.runtimeError(StrictErrorReporter.java:79) - runtimeError: message=[Automation server can't create object for 'ShockwaveFlash.ShockwaveFlash.6'.] sourceName=[http://www.google-analytics.com/ga.js] line=[18] lineSource=[null] lineOffset=[0]
18 Jan 2013 04:14:53,061 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.ActiveXObject.jsConstructor(ActiveXObject.java:128) - Automation server can't create object for 'ShockwaveFlash.ShockwaveFlash'.
18 Jan 2013 04:14:53,062 - main - ERROR - com.gargoylesoftware.htmlunit.javascript.StrictErrorReporter.runtimeError(StrictErrorReporter.java:79) - runtimeError: message=[Automation server can't create object for 'ShockwaveFlash.ShockwaveFlash'.] sourceName=[http://www.google-analytics.com/ga.js] line=[18] lineSource=[null] lineOffset=[0]
18 Jan 2013 04:14:53,829 - main - ERROR - com.gargoylesoftware.htmlunit.javascript.StrictErrorReporter.runtimeError(StrictErrorReporter.java:79) - runtimeError: message=[The data necessary to complete this operation is not yet available.] sourceName=[http://chat.livechatinc.net/licence/1051689/script.cgi?lang=en&groups=0] line=[60] lineSource=[null] lineOffset=[0]
18 Jan 2013 04:14:54,878 - main - ERROR - com.gargoylesoftware.htmlunit.javascript.StrictErrorReporter.runtimeError(StrictErrorReporter.java:79) - runtimeError: message=[The data necessary to complete this operation is not yet available.] sourceName=[http://platform.twitter.com/widgets.js] line=[5] lineSource=[null] lineOffset=[0]
18 Jan 2013 04:14:56,215 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.html.HTMLDocument.jsxFunction_getElementById(HTMLDocument.java:1049) - getElementById(sizzle-1358500496196) did a getElementByName for Internet Explorer
18 Jan 2013 04:14:56,458 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.html.HTMLDocument.jsxFunction_execCommand(HTMLDocument.java:1590) - Nothing done for execCommand(BackgroundImageCache, ...) (feature not implemented)
18 Jan 2013 04:14:58,086 - main - WARN - com.gargoylesoftware.htmlunit.javascript.host.html.HTMLDocument.jsxFunction_getElementById(HTMLDocument.java:1049) - getElementById(sizzle-1358500489525) did a getElementByName for Internet Explorer
And following is the thread dump for the process, captured with jstack:
2013-01-18 04:17:46
Full thread dump Java HotSpot(TM) 64-Bit Server VM (22.1-b02 mixed mode):
"Attach Listener" daemon prio=10 tid=0x0000000002955000 nid=0x16dd waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"Service Thread" daemon prio=10 tid=0x00007feca00cc800 nid=0x154f runnable [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"C2 CompilerThread1" daemon prio=10 tid=0x00007feca00ca000 nid=0x154e waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"C2 CompilerThread0" daemon prio=10 tid=0x00007feca00c7000 nid=0x154d waiting on condition [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"Signal Dispatcher" daemon prio=10 tid=0x00007feca00c5000 nid=0x154c runnable [0x0000000000000000]
java.lang.Thread.State: RUNNABLE
"Finalizer" daemon prio=10 tid=0x00007feca007c800 nid=0x154b in Object.wait() [0x00007fec9fffe000]
java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x00000000c2369e20> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:135)
- locked <0x00000000c2369e20> (a java.lang.ref.ReferenceQueue$Lock)
at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:151)
at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:177)
"Reference Handler" daemon prio=10 tid=0x00007feca007a000 nid=0x154a in Object.wait() [0x00007feca4157000]
java.lang.Thread.State: WAITING (on object monitor)
at java.lang.Object.wait(Native Method)
- waiting on <0x00000000c23699e0> (a java.lang.ref.Reference$Lock)
at java.lang.Object.wait(Object.java:503)
at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:133)
- locked <0x00000000c23699e0> (a java.lang.ref.Reference$Lock)
"main" prio=10 tid=0x00000000025d9000 nid=0x1546 runnable [0x00007fecaa8b6000]
java.lang.Thread.State: RUNNABLE
at net.sourceforge.htmlunit.corejs.javascript.ScriptableObject.getTopLevelScope(ScriptableObject.java:2007)
at com.gargoylesoftware.htmlunit.javascript.SimpleScriptable.getWindow(SimpleScriptable.java:303)
at com.gargoylesoftware.htmlunit.javascript.SimpleScriptable.getWindow(SimpleScriptable.java:293)
at com.gargoylesoftware.htmlunit.javascript.SimpleScriptable.getPrototype(SimpleScriptable.java:251)
at com.gargoylesoftware.htmlunit.javascript.host.html.HTMLCollection.<init>(HTMLCollection.java:99)
at com.gargoylesoftware.htmlunit.javascript.host.html.HTMLCollection.<init>(HTMLCollection.java:110)
at com.gargoylesoftware.htmlunit.javascript.host.HTMLCollectionFrames.<init>(Window.java:1751)
at com.gargoylesoftware.htmlunit.javascript.host.Window.getFrames(Window.java:759)
at com.gargoylesoftware.htmlunit.javascript.host.Window.jsxGet_length(Window.java:749)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at net.sourceforge.htmlunit.corejs.javascript.MemberBox.invoke(MemberBox.java:172)
at net.sourceforge.htmlunit.corejs.javascript.ScriptableObject$GetterSlot.getValue(ScriptableObject.java:342)
at net.sourceforge.htmlunit.corejs.javascript.ScriptableObject.getImpl(ScriptableObject.java:2523)
at net.sourceforge.htmlunit.corejs.javascript.ScriptableObject.get(ScriptableObject.java:438)
at com.gargoylesoftware.htmlunit.javascript.SimpleScriptable.get(SimpleScriptable.java:75)
at com.gargoylesoftware.htmlunit.javascript.host.Window.get(Window.java:1226)
at net.sourceforge.htmlunit.corejs.javascript.ScriptableObject.getProperty(ScriptableObject.java:2088)
at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.getObjectProp(ScriptRuntime.java:1527)
at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.getObjectProp(ScriptRuntime.java:1513)
at net.sourceforge.htmlunit.corejs.javascript.Interpreter.interpretLoop(Interpreter.java:1398)
at net.sourceforge.htmlunit.corejs.javascript.Interpreter.interpret(Interpreter.java:854)
at net.sourceforge.htmlunit.corejs.javascript.InterpretedFunction.call(InterpretedFunction.java:164)
at net.sourceforge.htmlunit.corejs.javascript.ContextFactory.doTopCall(ContextFactory.java:429)
at com.gargoylesoftware.htmlunit.javascript.HtmlUnitContextFactory.doTopCall(HtmlUnitContextFactory.java:267)
at net.sourceforge.htmlunit.corejs.javascript.ScriptRuntime.doTopCall(ScriptRuntime.java:3183)
at net.sourceforge.htmlunit.corejs.javascript.InterpretedFunction.call(InterpretedFunction.java:162)
at com.gargoylesoftware.htmlunit.javascript.JavaScriptEngine$4.doRun(JavaScriptEngine.java:538)
at com.gargoylesoftware.htmlunit.javascript.JavaScriptEngine$HtmlUnitContextAction.run(JavaScriptEngine.java:589)
- locked <0x00000000c274d308> (a com.gargoylesoftware.htmlunit.html.HtmlPage)
at net.sourceforge.htmlunit.corejs.javascript.Context.call(Context.java:537)
at net.sourceforge.htmlunit.corejs.javascript.ContextFactory.call(ContextFactory.java:538)
at com.gargoylesoftware.htmlunit.javascript.JavaScriptEngine.callFunction(JavaScriptEngine.java:545)
at com.gargoylesoftware.htmlunit.javascript.JavaScriptEngine.callFunction(JavaScriptEngine.java:520)
at com.gargoylesoftware.htmlunit.html.HtmlPage.executeJavaScriptFunctionIfPossible(HtmlPage.java:896)
at com.gargoylesoftware.htmlunit.javascript.host.EventListenersContainer.executeEventListeners(EventListenersContainer.java:162)
at com.gargoylesoftware.htmlunit.javascript.host.EventListenersContainer.executeBubblingListeners(EventListenersContainer.java:221)
at com.gargoylesoftware.htmlunit.javascript.host.Node.fireEvent(Node.java:735)
at com.gargoylesoftware.htmlunit.html.HtmlElement$2.run(HtmlElement.java:866)
at net.sourceforge.htmlunit.corejs.javascript.Context.call(Context.java:537)
at net.sourceforge.htmlunit.corejs.javascript.ContextFactory.call(ContextFactory.java:538)
at com.gargoylesoftware.htmlunit.html.HtmlElement.fireEvent(HtmlElement.java:871)
at com.gargoylesoftware.htmlunit.html.HtmlPage.executeEventHandlersIfNeeded(HtmlPage.java:1162)
at com.gargoylesoftware.htmlunit.html.HtmlPage.initialize(HtmlPage.java:202)
at com.gargoylesoftware.htmlunit.WebClient.loadWebResponseInto(WebClient.java:440)
at com.gargoylesoftware.htmlunit.WebClient.getPage(WebClient.java:311)
at com.gargoylesoftware.htmlunit.WebClient.getPage(WebClient.java:389)
"VM Thread" prio=10 tid=0x00007feca0072800 nid=0x1549 runnable
"GC task thread#0 (ParallelGC)" prio=10 tid=0x00000000025e4000 nid=0x1547 runnable
"GC task thread#1 (ParallelGC)" prio=10 tid=0x00000000025e5800 nid=0x1548 runnable
"VM Periodic Task Thread" prio=10 tid=0x00007feca00d7800 nid=0x1550 waiting on condition
JNI global references: 317
I am not sure why loading the URL is stuck. The call never returns from the method. Can anybody please look into it?
UPDATE
com.gargoylesoftware.htmlunit.html.HTMLParser.HtmlUnitDOMBuilder.parse(XMLInputSource)
@Override
public void parse(final XMLInputSource inputSource) throws XNIException, IOException {
    final HtmlUnitDOMBuilder oldBuilder = page_.getBuilder();
    page_.setBuilder(this);
    try {
        super.parse(inputSource);
    }
    finally {
        page_.setBuilder(oldBuilder);
    }
}
I attached the HtmlUnit source code to my project and debugged it. The method above never completes.
Also, I have set the timeout as follows:
webClient.setTimeout(120000);
So why does it not come out of the call after 2 minutes with some kind of timeout exception?
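My working assumption (not confirmed by the HtmlUnit documentation quoted here) is that `setTimeout` bounds the HTTP connect/read on the socket, not JavaScript execution inside the page, so a script that keeps the interpreter busy after the response arrives is never interrupted. A general-purpose workaround is to run the blocking `getPage` call under an external watchdog thread. Below is a minimal, self-contained sketch using only the JDK; `callWithDeadline` and the sample tasks are hypothetical names, and in real use the `Callable` body would wrap the `webClient.getPage(url)` call:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class WatchdogExample {

    // Run a potentially hanging task with a hard deadline. If the deadline
    // passes, the caller gets a TimeoutException and the worker thread is
    // interrupted via cancel(true).
    static <T> T callWithDeadline(Callable<T> task, long timeoutMillis)
            throws TimeoutException, InterruptedException, ExecutionException {
        ExecutorService executor = Executors.newSingleThreadExecutor();
        try {
            Future<T> future = executor.submit(task);
            try {
                return future.get(timeoutMillis, TimeUnit.MILLISECONDS);
            } catch (TimeoutException e) {
                future.cancel(true); // interrupt the stuck worker
                throw e;
            }
        } finally {
            executor.shutdownNow();
        }
    }

    public static void main(String[] args) throws Exception {
        // A fast task completes within the deadline.
        String ok = callWithDeadline(() -> "page loaded", 1000);
        System.out.println(ok);

        // A slow task is cut off once the deadline passes.
        try {
            callWithDeadline(() -> {
                Thread.sleep(5000);
                return "never";
            }, 200);
        } catch (TimeoutException e) {
            System.out.println("timed out");
        }
    }
}
```

Note that `cancel(true)` only sets the worker's interrupt status; if the hung code never checks it (as a tight JavaScript loop inside HtmlUnit may not), the thread can linger in the background. This pattern therefore bounds how long the *caller* waits; it does not guarantee the stuck work is cleaned up.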
I have followed up with the HtmlUnit user group. They have resolved this issue in version 2.12 of HtmlUnit. Please check.