TDCH failing while importing to hive table - teradata

hadoop jar /usr/lib/tdch/1.4/lib/teradata-connector-1.4.4.jar com.teradata.connector.common.tool.ConnectorImportTool \
-url jdbc:teradata://192.168.2.128/DATABASE=db_1 \
-username dbc \
-password dbc \
-jobtype hive \
-fileformat textfile \
-sourcetable employee \
-nummappers 1 \
-targettable td_employee \
-targettableschema "emp_id int,firstname string,lastname string"
Here is the log. I have already added the hive-serde JAR to HADOOP_CLASSPATH.
17/04/20 04:26:56 INFO tool.ConnectorImportTool: ConnectorImportTool starts at 1492687616920
17/04/20 04:26:58 INFO common.ConnectorPlugin: load plugins in jar:file:/usr/lib/tdch/1.4/lib/teradata-connector-1.4.4.jar!/teradata.connector.plugins.xml
17/04/20 04:26:59 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
17/04/20 04:26:59 INFO metastore.ObjectStore: ObjectStore, initialize called
17/04/20 04:26:59 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
17/04/20 04:26:59 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
17/04/20 04:27:03 INFO metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
17/04/20 04:27:03 INFO metastore.MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "#" (64), after : "".
17/04/20 04:27:05 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/04/20 04:27:05 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/04/20 04:27:05 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
17/04/20 04:27:05 INFO DataNucleus.Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
17/04/20 04:27:05 INFO DataNucleus.Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery#0" since the connection used is closing
17/04/20 04:27:05 INFO metastore.ObjectStore: Initialized ObjectStore
17/04/20 04:27:06 INFO metastore.HiveMetaStore: Added admin role in metastore
17/04/20 04:27:06 INFO metastore.HiveMetaStore: Added public role in metastore
17/04/20 04:27:06 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
17/04/20 04:27:06 INFO metastore.HiveMetaStore: 0: get_table : db=default tbl=td_employee
17/04/20 04:27:06 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=default tbl=td_employee
17/04/20 04:27:06 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at: 1492687626978
17/04/20 04:27:08 INFO utils.TeradataUtils: the input database product is Teradata
17/04/20 04:27:08 INFO utils.TeradataUtils: the input database version is 16.0
17/04/20 04:27:08 INFO utils.TeradataUtils: the jdbc driver version is 15.0
17/04/20 04:27:08 INFO processor.TeradataInputProcessor: the teradata connector for hadoop version is: 1.4.4
17/04/20 04:27:08 INFO processor.TeradataInputProcessor: input jdbc properties are jdbc:teradata://192.168.2.128/DATABASE=db_1
17/04/20 04:27:09 INFO processor.TeradataInputProcessor: the number of mappers are 1
17/04/20 04:27:09 INFO processor.TeradataInputProcessor: input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at: 1492687629069
17/04/20 04:27:09 INFO processor.TeradataInputProcessor: the total elapsed time of input preprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 2s
17/04/20 04:27:10 INFO Configuration.deprecation: mapred.output.dir is deprecated. Instead, use mapreduce.output.fileoutputformat.outputdir
17/04/20 04:27:10 INFO metastore.HiveMetaStore: No user is added in admin role, since config is empty
17/04/20 04:27:10 INFO metastore.HiveMetaStore: 0: get_table : db=default tbl=td_employee
17/04/20 04:27:10 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=get_table : db=default tbl=td_employee
17/04/20 04:27:10 INFO metastore.HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
17/04/20 04:27:10 INFO metastore.ObjectStore: ObjectStore, initialize called
17/04/20 04:27:10 INFO metastore.MetaStoreDirectSql: MySQL check failed, assuming we are not on mysql: Lexical error at line 1, column 5. Encountered: "#" (64), after : "".
17/04/20 04:27:10 INFO DataNucleus.Query: Reading in results for query "org.datanucleus.store.rdbms.query.SQLQuery#0" since the connection used is closing
17/04/20 04:27:10 INFO metastore.ObjectStore: Initialized ObjectStore
17/04/20 04:27:10 INFO processor.HiveOutputProcessor: hive table default.td_employee does not exist
17/04/20 04:27:10 INFO metastore.HiveMetaStore: 0: Shutting down the object store...
17/04/20 04:27:10 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Shutting down the object store...
17/04/20 04:27:10 INFO metastore.HiveMetaStore: 0: Metastore shutdown complete.
17/04/20 04:27:10 INFO HiveMetaStore.audit: ugi=root ip=unknown-ip-addr cmd=Metastore shutdown complete.
17/04/20 04:27:11 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
17/04/20 04:27:13 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
17/04/20 04:27:13 WARN mapred.ResourceMgrDelegate: getBlacklistedTrackers - Not implemented yet
17/04/20 04:27:13 INFO mapreduce.JobSubmitter: number of splits:1
17/04/20 04:27:14 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1492661647325_0001
17/04/20 04:27:14 INFO impl.YarnClientImpl: Submitted application application_1492661647325_0001
17/04/20 04:27:14 INFO mapreduce.Job: The url to track the job: http://sandbox.hortonworks.com:8088/proxy/application_1492661647325_0001/
17/04/20 04:27:14 INFO mapreduce.Job: Running job: job_1492661647325_0001
17/04/20 04:27:34 INFO mapreduce.Job: Job job_1492661647325_0001 running in uber mode : false
17/04/20 04:27:34 INFO mapreduce.Job: map 0% reduce 0%
17/04/20 04:27:49 INFO mapreduce.Job: Task Id : attempt_1492661647325_0001_m_000000_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.<init>(ConnectorOutputFormat.java:91)
at com.teradata.connector.common.ConnectorOutputFormat.getRecordWriter(ConnectorOutputFormat.java:38)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
17/04/20 04:28:00 INFO mapreduce.Job: Task Id : attempt_1492661647325_0001_m_000000_1, Status : FAILED
Error: org.apache.hadoop.fs.FileAlreadyExistsException: /user/root/temp_042710/part-m-00000 for client 10.0.2.15 already exists
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2309)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2237)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2190)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:520)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:354)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1604)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1465)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1390)
at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:394)
at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:390)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:390)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:334)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:887)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:784)
at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:132)
at com.teradata.connector.hive.HiveTextFileOutputFormat.getRecordWriter(HiveTextFileOutputFormat.java:22)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.<init>(ConnectorOutputFormat.java:89)
at com.teradata.connector.common.ConnectorOutputFormat.getRecordWriter(ConnectorOutputFormat.java:38)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.fs.FileAlreadyExistsException): /user/root/temp_042710/part-m-00000 for client 10.0.2.15 already exists
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2309)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2237)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2190)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:520)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:354)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
at org.apache.hadoop.ipc.Client.call(Client.java:1410)
at org.apache.hadoop.ipc.Client.call(Client.java:1363)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy15.create(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
at com.sun.proxy.$Proxy15.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:258)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1600)
... 22 more
17/04/20 04:28:05 INFO mapreduce.Job: Task Id : attempt_1492661647325_0001_m_000000_2, Status : FAILED
Error: org.apache.hadoop.fs.FileAlreadyExistsException: /user/root/temp_042710/part-m-00000 for client 10.0.2.15 already exists
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2309)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2237)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2190)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:520)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:354)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1604)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1465)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1390)
at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:394)
at org.apache.hadoop.hdfs.DistributedFileSystem$6.doCall(DistributedFileSystem.java:390)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:390)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:334)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:906)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:887)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:784)
at org.apache.hadoop.mapreduce.lib.output.TextOutputFormat.getRecordWriter(TextOutputFormat.java:132)
at com.teradata.connector.hive.HiveTextFileOutputFormat.getRecordWriter(HiveTextFileOutputFormat.java:22)
at com.teradata.connector.common.ConnectorOutputFormat$ConnectorFileRecordWriter.<init>(ConnectorOutputFormat.java:89)
at com.teradata.connector.common.ConnectorOutputFormat.getRecordWriter(ConnectorOutputFormat.java:38)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:624)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:744)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:340)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.fs.FileAlreadyExistsException): /user/root/temp_042710/part-m-00000 for client 10.0.2.15 already exists
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInternal(FSNamesystem.java:2309)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFileInt(FSNamesystem.java:2237)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.startFile(FSNamesystem.java:2190)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.create(NameNodeRpcServer.java:520)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.create(ClientNamenodeProtocolServerSideTranslatorPB.java:354)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2013)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2009)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1557)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2007)
at org.apache.hadoop.ipc.Client.call(Client.java:1410)
at org.apache.hadoop.ipc.Client.call(Client.java:1363)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy15.create(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
at com.sun.proxy.$Proxy15.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:258)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1600)
... 22 more
17/04/20 04:28:13 INFO mapreduce.Job: map 100% reduce 0%
17/04/20 04:28:14 INFO mapreduce.Job: Job job_1492661647325_0001 failed with state FAILED due to: Task failed task_1492661647325_0001_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0
17/04/20 04:28:14 INFO mapreduce.Job: Counters: 12
Job Counters
Failed map tasks=4
Launched map tasks=4
Other local map tasks=3
Data-local map tasks=1
Total time spent by all maps in occupied slots (ms)=30868
Total time spent by all reduces in occupied slots (ms)=0
Total time spent by all map tasks (ms)=30868
Total vcore-seconds taken by all map tasks=30868
Total megabyte-seconds taken by all map tasks=7717000
Map-Reduce Framework
CPU time spent (ms)=0
Physical memory (bytes) snapshot=0
Virtual memory (bytes) snapshot=0
17/04/20 04:28:14 WARN tool.ConnectorJobRunner: com.teradata.connector.common.exception.ConnectorException: The output post processor returns 1
17/04/20 04:28:14 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor starts at: 1492687694783
17/04/20 04:28:15 INFO processor.TeradataInputProcessor: input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor ends at: 1492687694783
17/04/20 04:28:15 INFO processor.TeradataInputProcessor: the total elapsed time of input postprocessor com.teradata.connector.teradata.processor.TeradataSplitByHashProcessor is: 0s
17/04/20 04:28:15 INFO tool.ConnectorImportTool: ConnectorImportTool ends at 1492687695150
17/04/20 04:28:15 INFO tool.ConnectorImportTool: ConnectorImportTool time is 78s
17/04/20 04:28:15 INFO tool.ConnectorImportTool: job completed with exit code 1

A TDCH import will fail if you are using an older version of Sqoop; check the version compatibility.
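Beyond version compatibility: the first task attempt dies with ClassNotFoundException: org.apache.hadoop.hive.serde2.SerDeException, and the later FileAlreadyExistsException failures are just the retries colliding with the output of that first failed attempt. Setting HADOOP_CLASSPATH only affects the client JVM, so the Hive jars also have to be shipped to the map tasks, which TDCH examples typically do with -libjars. A minimal sketch, assuming a Hortonworks-style Hive lib directory (the paths and unversioned jar names are assumptions; point them at wherever hive-serde/hive-exec/hive-metastore live on your node):

# Hedged sketch: put the Hive jars on the client classpath (HADOOP_CLASSPATH)
# and ship them to the map tasks (-libjars, comma-separated).
HIVE_LIB=/usr/hdp/current/hive-client/lib
export LIB_JARS=$HIVE_LIB/hive-serde.jar,$HIVE_LIB/hive-exec.jar,$HIVE_LIB/hive-metastore.jar
export HADOOP_CLASSPATH=$HIVE_LIB/hive-serde.jar:$HIVE_LIB/hive-exec.jar:$HIVE_LIB/hive-metastore.jar:$HADOOP_CLASSPATH

hadoop jar /usr/lib/tdch/1.4/lib/teradata-connector-1.4.4.jar \
  com.teradata.connector.common.tool.ConnectorImportTool \
  -libjars $LIB_JARS \
  -url jdbc:teradata://192.168.2.128/DATABASE=db_1 \
  -username dbc \
  -password dbc \
  -jobtype hive \
  -fileformat textfile \
  -sourcetable employee \
  -nummappers 1 \
  -targettable td_employee \
  -targettableschema "emp_id int,firstname string,lastname string"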

Related

Corda service<init> unable to load org.apache.kafka.common.serialization.StringSerializer

I have a CordaService that connects to an external Apache Kafka MQ (inside the init block) using the kafka-clients library. The Kafka producer requires access to the org.apache.kafka.common.serialization.StringSerializer class, which is defined in the kafka-clients library.
I have included the kafka-clients library in the build.gradle of the CorDapp module where the service is defined, as below:
compile "org.apache.kafka:kafka-clients:2.0.1"
Upon Corda node startup I get the below message:
[WARN ] 2019-08-30T07:04:40,551Z [main] internal.Node.installCordaService - com.example.flow.ProducerService is using legacy CordaService constructor with ServiceHub parameter. Upgrade to an AppServiceHub parameter to enable updated API features.
[ERROR] 2019-08-30T07:04:40,716Z [main] internal.Node.installCordaServices - Unable to install Corda service com.example.flow.ProducerService [errorCode=1aep02i, moreInformationAt=https://errors.corda.net/OS/4.0/1aep02i]
java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:1.8.0_181]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source) ~[?:1.8.0_181]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source) ~[?:1.8.0_181]
at java.lang.reflect.Constructor.newInstance(Unknown Source) ~[?:1.8.0_181]
at net.corda.node.internal.AbstractNode.installCordaService(AbstractNode.kt:654) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.AbstractNode.installCordaServices(AbstractNode.kt:577) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.AbstractNode.access$installCordaServices(AbstractNode.kt:120) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.AbstractNode$start$7.invoke(AbstractNode.kt:382) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.AbstractNode$start$7.invoke(AbstractNode.kt:120) ~[corda-node-4.0.jar:?]
at net.corda.nodeapi.internal.persistence.CordaPersistence.inTopLevelTransaction(CordaPersistence.kt:236) ~[corda-node-api-4.0.jar:?]
at net.corda.nodeapi.internal.persistence.CordaPersistence.transaction(CordaPersistence.kt:221) ~[corda-node-api-4.0.jar:?]
at net.corda.nodeapi.internal.persistence.CordaPersistence.transaction(CordaPersistence.kt:199) ~[corda-node-api-4.0.jar:?]
at net.corda.nodeapi.internal.persistence.CordaPersistence.transaction(CordaPersistence.kt:205) ~[corda-node-api-4.0.jar:?]
at net.corda.node.internal.AbstractNode.start(AbstractNode.kt:371) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.Node.start(Node.kt:419) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartup.startNode(NodeStartup.kt:185) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartupCli$runProgram$2.run(NodeStartup.kt:110) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartup$initialiseAndRun$5.invoke(NodeStartup.kt:162) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartup$initialiseAndRun$5.invoke(NodeStartup.kt:117) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartupLogging$DefaultImpls.attempt(NodeStartup.kt:450) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartup.attempt(NodeStartup.kt:117) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartup.initialiseAndRun(NodeStartup.kt:160) ~[corda-node-4.0.jar:?]
at net.corda.node.internal.NodeStartupCli.runProgram(NodeStartup.kt:108) ~[corda-node-4.0.jar:?]
at net.corda.cliutils.CordaCliWrapper.call(CordaCliWrapper.kt:184) ~[corda-tools-cliutils-4.0.jar:?]
at net.corda.cliutils.CordaCliWrapper.call(CordaCliWrapper.kt:152) ~[corda-tools-cliutils-4.0.jar:?]
at picocli.CommandLine.execute(CommandLine.java:1056) ~[picocli-3.8.0.jar:3.8.0]
at picocli.CommandLine.access$900(CommandLine.java:142) ~[picocli-3.8.0.jar:3.8.0]
at picocli.CommandLine$RunLast.handle(CommandLine.java:1246) ~[picocli-3.8.0.jar:3.8.0]
at picocli.CommandLine$RunLast.handle(CommandLine.java:1214) ~[picocli-3.8.0.jar:3.8.0]
at picocli.CommandLine$AbstractParseResultHandler.handleParseResult(CommandLine.java:1122) ~[picocli-3.8.0.jar:3.8.0]
at picocli.CommandLine.parseWithHandlers(CommandLine.java:1405) ~[picocli-3.8.0.jar:3.8.0]
at net.corda.cliutils.CordaCliWrapperKt.start(CordaCliWrapper.kt:72) ~[corda-tools-cliutils-4.0.jar:?]
at net.corda.node.Corda.main(Corda.kt:13) ~[corda-node-4.0.jar:?]
Caused by: org.apache.kafka.common.config.ConfigException: Invalid value org.apache.kafka.common.serialization.StringSerializer for configuration key.serializer: Class org.apache.kafka.common.serialization.StringSerializer could not be found.
at org.apache.kafka.common.config.ConfigDef.parseType(ConfigDef.java:724) ~[?:?]
at org.apache.kafka.common.config.ConfigDef.parseValue(ConfigDef.java:469) ~[?:?]
at org.apache.kafka.common.config.ConfigDef.parse(ConfigDef.java:462) ~[?:?]
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:62) ~[?:?]
at org.apache.kafka.common.config.AbstractConfig.<init>(AbstractConfig.java:75) ~[?:?]
at org.apache.kafka.clients.producer.ProducerConfig.<init>(ProducerConfig.java:364) ~[?:?]
at org.apache.kafka.clients.producer.KafkaProducer.<init>(KafkaProducer.java:304) ~[?:?]
at com.example.flow.Producer.<init>(Service.kt:26) ~[?:?]
at com.example.flow.Ping.<init>(Service.kt:49) ~[?:?]
at com.example.flow.ProducerService.<init>(Service.kt:21) ~[?:?]
... 33 more
[INFO ] 2019-08-30T07:04:41,863Z [main] statemachine.SingleThreadedStateMachineManager.invoke - Node ready, info: NodeInfo(addresses=[localhost:10004], legalIdentitiesAndCerts=[O=PartyA, L=London, C=GB], platformVersion=4, serial=1567146729127)
[INFO ] 2019-08-30T07:04:42,001Z [Node thread-1] internal.Node.registerJmxReporter - Registering JMX reporter:
[INFO ] 2019-08-30T07:04:42,003Z [main] BasicInfo.printBasicNodeInfo - Loaded 2 CorDapp(s) : Contract CorDapp: CorDapp Example version 1 by vendor Corda Open Source with licence Apache License, Version 2.0, Workflow CorDapp: CorDapp Example version 1 by vendor Corda Open Source with licence Apache License, Version 2.0
[INFO ] 2019-08-30T07:04:42,004Z [Node thread-1] internal.Node.registerJolokiaReporter - Registering Jolokia JMX reporter:
[INFO ] 2019-08-30T07:04:42,009Z [main] BasicInfo.printBasicNodeInfo - Node for "PartyA" started up and registered in 51.24 sec
[INFO ] 2019-08-30T07:04:42,012Z [main] rpc.RPCServer.start - Starting RPC server with configuration RPCServerConfiguration(rpcThreadPoolSize=4, reapInterval=PT1S, deduplicationCacheExpiry=PT24H)
What could be the reason the Corda node is unable to load the org.apache.kafka.common.serialization.StringSerializer class?
Resolved. I had to add `Thread.currentThread().setContextClassLoader(null);` since the StringSerializer class is by default loaded by the application class loader.

Artifactory 5.11 fails to start up due to Access failure

My Artifactory instance has been running for months, but it has completely stopped working since the last update (to 5.11).
When I try to start it up I see the following errors in the logs:
[JFrog Access ASCII-art startup banner]
Access Version: 3.3.2
Access Revision: 30302900
2018-05-15 08:57:43.904 INFO 32571 --- [ost-startStop-1] o.j.a.s.startup.AccessHomeFinderImpl : Using JFrog Access home at '/var/opt/jfrog/artifactory/access' resolved from: System property (Artifactory)
2018-05-15 08:57:44.132 ERROR 32571 --- [ost-startStop-1] o.s.boot.SpringApplication : Application startup failed
java.lang.IllegalArgumentException: Unknown value:
at org.jfrog.access.util.EnumUtils.lambda$fromValue$0(EnumUtils.java:30) ~[access-common-api-3.3.2.jar:na]
at java.util.Optional.orElseThrow(Optional.java:290) ~[na:1.8.0_161]
at org.jfrog.access.util.EnumUtils.fromValue(EnumUtils.java:70) ~[access-common-api-3.3.2.jar:na]
at org.jfrog.access.util.EnumUtils.fromValue(EnumUtils.java:30) ~[access-common-api-3.3.2.jar:na]
at org.jfrog.access.server.home.migration.EnvironmentVersion.fromVersionString(EnvironmentVersion.java:62) ~[access-server-core-3.3.2.jar:na]
at org.jfrog.access.server.home.migration.EnvironmentConfig.readEnvVersionFile(EnvironmentConfig.java:72) ~[access-server-core-3.3.2.jar:na]
at org.jfrog.access.server.home.migration.EnvironmentConfig.getConfigVersion(EnvironmentConfig.java:46) ~[access-server-core-3.3.2.jar:na]
at org.jfrog.access.migration.ConfigMigrationRunner.migrateIfNeeded(ConfigMigrationRunner.java:37) ~[access-common-core-3.3.2.jar:na]
at org.jfrog.access.server.startup.AccessServerStartupFacadeImpl.migrateEnvironment(AccessServerStartupFacadeImpl.java:48) ~[access-server-core-3.3.2.jar:na]
at org.jfrog.access.server.startup.AccessServerStartupFacadeImpl.prepareEnvironment(AccessServerStartupFacadeImpl.java:30) ~[access-server-core-3.3.2.jar:na]
at org.jfrog.access.context.AccessApplicationContextInitializer.prepareEnvironment(AccessApplicationContextInitializer.java:48) ~[access-application-3.3.2.jar:3.3.2]
at org.jfrog.access.context.AccessApplicationContextInitializer.prepareEnvironment(AccessApplicationContextInitializer.java:26) ~[access-application-3.3.2.jar:3.3.2]
at org.jfrog.app.context.JFrogApplicationContextInitializer.initialize(JFrogApplicationContextInitializer.java:69) ~[jfrog-application-1.5.2.jar:na]
at org.springframework.boot.SpringApplication.applyInitializers(SpringApplication.java:567) ~[spring-boot-1.5.6.RELEASE.jar:1.5.6.RELEASE]
at org.springframework.boot.SpringApplication.prepareContext(SpringApplication.java:338) ~[spring-boot-1.5.6.RELEASE.jar:1.5.6.RELEASE]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:301) ~[spring-boot-1.5.6.RELEASE.jar:1.5.6.RELEASE]
at org.springframework.boot.web.support.SpringBootServletInitializer.run(SpringBootServletInitializer.java:151) [spring-boot-1.5.6.RELEASE.jar:1.5.6.RELEASE]
at org.springframework.boot.web.support.SpringBootServletInitializer.createRootApplicationContext(SpringBootServletInitializer.java:131) [spring-boot-1.5.6.RELEASE.jar:1.5.6.RELEASE]
at org.springframework.boot.web.support.SpringBootServletInitializer.onStartup(SpringBootServletInitializer.java:86) [spring-boot-1.5.6.RELEASE.jar:1.5.6.RELEASE]
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:169) [spring-web-4.3.10.RELEASE.jar:4.3.10.RELEASE]
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5196) [catalina.jar:8.5.23]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150) [catalina.jar:8.5.23]
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:752) [catalina.jar:8.5.23]
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:728) [catalina.jar:8.5.23]
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:734) [catalina.jar:8.5.23]
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:630) [catalina.jar:8.5.23]
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1842) [catalina.jar:8.5.23]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_161]
at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_161]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [na:1.8.0_161]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [na:1.8.0_161]
at java.lang.Thread.run(Thread.java:748) [na:1.8.0_161]
May 15, 2018 8:57:44 AM org.apache.catalina.core.ContainerBase addChildInternal
SEVERE: ContainerBase.addChild: start:
org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/access]]
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:167)
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:752)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:728)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:734)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:630)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1842)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Unknown value:
at org.jfrog.access.util.EnumUtils.lambda$fromValue$0(EnumUtils.java:30)
at java.util.Optional.orElseThrow(Optional.java:290)
at org.jfrog.access.util.EnumUtils.fromValue(EnumUtils.java:70)
at org.jfrog.access.util.EnumUtils.fromValue(EnumUtils.java:30)
at org.jfrog.access.server.home.migration.EnvironmentVersion.fromVersionString(EnvironmentVersion.java:62)
at org.jfrog.access.server.home.migration.EnvironmentConfig.readEnvVersionFile(EnvironmentConfig.java:72)
at org.jfrog.access.server.home.migration.EnvironmentConfig.getConfigVersion(EnvironmentConfig.java:46)
at org.jfrog.access.migration.ConfigMigrationRunner.migrateIfNeeded(ConfigMigrationRunner.java:37)
at org.jfrog.access.server.startup.AccessServerStartupFacadeImpl.migrateEnvironment(AccessServerStartupFacadeImpl.java:48)
at org.jfrog.access.server.startup.AccessServerStartupFacadeImpl.prepareEnvironment(AccessServerStartupFacadeImpl.java:30)
at org.jfrog.access.context.AccessApplicationContextInitializer.prepareEnvironment(AccessApplicationContextInitializer.java:48)
at org.jfrog.access.context.AccessApplicationContextInitializer.prepareEnvironment(AccessApplicationContextInitializer.java:26)
at org.jfrog.app.context.JFrogApplicationContextInitializer.initialize(JFrogApplicationContextInitializer.java:69)
at org.springframework.boot.SpringApplication.applyInitializers(SpringApplication.java:567)
at org.springframework.boot.SpringApplication.prepareContext(SpringApplication.java:338)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:301)
at org.springframework.boot.web.support.SpringBootServletInitializer.run(SpringBootServletInitializer.java:151)
at org.springframework.boot.web.support.SpringBootServletInitializer.createRootApplicationContext(SpringBootServletInitializer.java:131)
at org.springframework.boot.web.support.SpringBootServletInitializer.onStartup(SpringBootServletInitializer.java:86)
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:169)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5196)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 10 more
May 15, 2018 8:57:44 AM org.apache.catalina.startup.HostConfig deployDescriptor
SEVERE: Error deploying configuration descriptor [/opt/jfrog/artifactory/tomcat/conf/Catalina/localhost/access.xml]
java.lang.IllegalStateException: ContainerBase.addChild: start: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Catalina].StandardHost[localhost].StandardContext[/access]]
at org.apache.catalina.core.ContainerBase.addChildInternal(ContainerBase.java:756)
at org.apache.catalina.core.ContainerBase.addChild(ContainerBase.java:728)
at org.apache.catalina.core.StandardHost.addChild(StandardHost.java:734)
at org.apache.catalina.startup.HostConfig.deployDescriptor(HostConfig.java:630)
at org.apache.catalina.startup.HostConfig$DeployDescriptor.run(HostConfig.java:1842)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
May 15, 2018 8:57:44 AM org.apache.catalina.startup.HostConfig deployDescriptor
INFO: Deployment of configuration descriptor [/opt/jfrog/artifactory/tomcat/conf/Catalina/localhost/access.xml] has finished in [6,262] ms
2018-05-15 08:57:44 [ARTIFACTORY] [INFO ] Starting Artifactory [artifactory.home=/var/opt/jfrog/artifactory].
2018-05-15 08:57:44,833 [art-init] [INFO ] (o.a.w.s.ArtifactoryContextConfigListener:265) -
This then causes Artifactory to fail because it can't find the Access server:
2018-05-15 08:58:03,475 [art-init] [INFO ] (o.a.s.a.ArtifactoryAccessClientConfigStore:556) - Using Access Server URL: http://localhost:8040/access (bundled) source: detected
2018-05-15 08:58:03,959 [art-init] [INFO ] (o.a.s.a.AccessServiceImpl:290) - Waiting for access server...
2018-05-15 08:58:04,678 [art-init] [WARN ] (o.j.a.c.AccessClientHttpException:39) - Unrecognized ErrorsModel by Access. Original message: Failed on executing /api/v1/system/ping, with response: Not Found
2018-05-15 08:58:06,697 [art-init] [WARN ] (o.j.a.c.AccessClientHttpException:39) - Unrecognized ErrorsModel by Access. Original message: Failed on executing /api/v1/system/ping, with response: Not Found
...
2018-05-15 08:59:35,043 [art-init] [ERROR] (o.a.w.s.ArtifactoryContextConfigListener:92) - Application could not be initialized: Waiting for access server to respond timed-out
java.lang.reflect.InvocationTargetException: null
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.artifactory.webapp.servlet.ArtifactoryContextConfigListener.configure(ArtifactoryContextConfigListener.java:207)
at org.artifactory.webapp.servlet.ArtifactoryContextConfigListener.access$200(ArtifactoryContextConfigListener.java:63)
at org.artifactory.webapp.servlet.ArtifactoryContextConfigListener$1.run(ArtifactoryContextConfigListener.java:88)
Caused by: org.springframework.beans.factory.BeanInitializationException: Failed to initialize bean 'org.artifactory.security.access.AccessService'.; nested exception is java.lang.IllegalStateException: Waiting for access server to respond timed-out
at org.artifactory.spring.ArtifactoryApplicationContext.refresh(ArtifactoryApplicationContext.java:250)
at org.artifactory.spring.ArtifactoryApplicationContext.<init>(ArtifactoryApplicationContext.java:133)
... 7 common frames omitted
Caused by: java.lang.IllegalStateException: Waiting for access server to respond timed-out
at org.artifactory.security.access.AccessServiceImpl.waitForAccessServer(AccessServiceImpl.java:305)
at org.artifactory.security.access.AccessServiceImpl.initAccessService(AccessServiceImpl.java:265)
at org.artifactory.security.access.AccessServiceImpl.initIfNeeded(AccessServiceImpl.java:250)
at org.artifactory.security.access.AccessServiceImpl.init(AccessServiceImpl.java:244)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:99)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:281)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:96)
at org.artifactory.storage.fs.lock.aop.LockingAdvice.invoke(LockingAdvice.java:76)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy41.init(Unknown Source)
at org.artifactory.spring.ArtifactoryApplicationContext.refresh(ArtifactoryApplicationContext.java:248)
... 8 common frames omitted
2018-05-15 09:01:16,534 [http-nio-8081-exec-1] [ERROR] (o.a.w.s.ArtifactoryFilter:213) - Artifactory failed to initialize: Context is null
I have not been able to find anything about the Access error and none of the posts similar to the Artifactory error (like this https://www.jfrog.com/jira/browse/RTFACT-14477) have helped.
Any assistance would be greatly appreciated.
What version of Artifactory have you upgraded from? Is this a Zip installation?
Have you changed/removed any content in the $ARTIFACTORY_HOME/access/ folder? Specifically the $ARTIFACTORY_HOME/access/data/access.env.version.
Can you confirm this file is still there and is the same as it was prior to the upgrade?
Looks like it was touched by someone.
Can you change the content of the file to "4" (1 byte, with no space or blank line afterwards), then restart and see if there is any change?
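A hedged sketch of resetting that file, using the path from the comment above; printf (unlike echo) writes the character without a trailing newline, so the file ends up exactly 1 byte long:

# Keep a backup, then write a single '4' with no trailing newline.
cp $ARTIFACTORY_HOME/access/data/access.env.version $ARTIFACTORY_HOME/access/data/access.env.version.bak
printf '4' > $ARTIFACTORY_HOME/access/data/access.env.version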
To resolve this issue, please delete the db.lck + dbex.lck files from the $ARTIFACTORY_HOME/data/derby/ folder.
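And a sketch of that Derby lock cleanup; the lock file paths come from the answer above, while the service name is an assumption for a typical install (stop Artifactory before touching the lock files):

sudo service artifactory stop
rm -f $ARTIFACTORY_HOME/data/derby/db.lck $ARTIFACTORY_HOME/data/derby/dbex.lck
sudo service artifactory start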

Could not open client transport with JDBC Connection refused in R

> rhive.connect(host = "192.168.1.4", port = 9000, defaultFS = "hdfs://localhost:9000")
Warning:
+----------------------------------------------------------+
+ / hiveServer2 argument has not been provided correctly.  +
+ / RHive will use a default value: hiveServer2=TRUE.      +
+----------------------------------------------------------+
16/08/14 14:12:42 INFO jdbc.Utils: Supplied authorities: 192.168.1.4:9000
16/08/14 14:12:42 INFO jdbc.Utils: Resolved authority: 192.168.1.4:9000
16/08/14 14:12:42 INFO jdbc.HiveConnection: Transport Used for JDBC connection: null
Exception in thread "Thread-14" java.lang.RuntimeException: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.4:9000/default: java.net.ConnectException: Connection refused
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:337)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.run(HiveJdbcClient.java:322)
Caused by: java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.4:9000/default: java.net.ConnectException: Connection refused
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:208)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:154)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:107)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at com.nexr.rhive.hive.DatabaseConnection.connect(DatabaseConnection.java:51)
at com.nexr.rhive.hive.HiveJdbcClient$HiveJdbcConnector.connect(HiveJdbcClient.java:330)
... 1 more
Caused by: org.apache.thrift.transport.TTransportException: java.net.ConnectException: Connection refused
at org.apache.thrift.transport.TSocket.open(TSocket.java:226)
at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:266)
at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:183)
... 7 more
Caused by: java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at org.apache.thrift.transport.TSocket.open(TSocket.java:221)
... 10 more
Error: java.lang.IllegalStateException: Not connected to hiveserver
You are not passing the password.
Try to pass the password as part of your script.
Null is returned if you don't provide valid credentials.
Example:
jdbc:hive2://192.168.1.4:9000/default username password
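One way to check whether it really is a credentials problem is to try the same JDBC URI from the error outside R with beeline, supplying a username and password; a hedged sketch (the credentials are placeholders):

# If this also fails with "Connection refused", nothing is listening on that
# host/port, so the problem is the connection itself rather than credentials.
beeline -u "jdbc:hive2://192.168.1.4:9000/default" -n hiveuser -p hivepassword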

How to generate SQL for Oracle 11g with Liquibase 3.3 in offline mode?

Problem
I use YAML scripts that describe my DB object structure. It is very convenient because I can have DB-based integration tests with H2.
But my client's production service requires that we provide Oracle-oriented SQL scripts (they use Liquibase, but in a non-standard way that we cannot challenge/change).
So I would like to generate these Oracle SQL scripts from my YAML ones automatically.
Thoughts
I first found the liquibase:updateSQL command. The problem is that I do not have access to the DB (I don't know the JDBC URL, nor am I on the same network), so this solution cannot work as-is.
Then I found a new option in Liquibase that allows an "offline" mode for the updateSQL command. It really seems to be the solution I'm looking for, but then I get the following error (using Maven with the -X -e options):
[ERROR] Failed to execute goal org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL (default-cli) on project granit: Error setting up or running Liquibase: liquibase.exception.UnexpectedLiquibaseException: java.lang.NoSuchMethodException: liquibase.database.OfflineConnection.getWrappedConnection() -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL (default-cli) on project granit: Error setting up or running Liquibase: liquibase.exception.UnexpectedLiquibaseException: java.lang.NoSuchMethodException: liquibase.database.OfflineConnection.getWrappedConnection()
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:216)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:108)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:76)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:116)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:361)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.MojoExecutionException: Error setting up or running Liquibase: liquibase.exception.UnexpectedLiquibaseException: java.lang.NoSuchMethodException: liquibase.database.OfflineConnection.getWrappedConnection()
at org.liquibase.maven.plugins.AbstractLiquibaseMojo.execute(AbstractLiquibaseMojo.java:373)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:133)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: liquibase.exception.DatabaseException: liquibase.exception.UnexpectedLiquibaseException: java.lang.NoSuchMethodException: liquibase.database.OfflineConnection.getWrappedConnection()
at liquibase.integration.commandline.CommandLineUtils.createDatabaseObject(CommandLineUtils.java:69)
at org.liquibase.maven.plugins.AbstractLiquibaseMojo.execute(AbstractLiquibaseMojo.java:321)
... 21 more
Caused by: liquibase.exception.UnexpectedLiquibaseException: java.lang.NoSuchMethodException: liquibase.database.OfflineConnection.getWrappedConnection()
at liquibase.database.core.OracleDatabase.setConnection(OracleDatabase.java:62)
at liquibase.database.DatabaseFactory.findCorrectDatabaseImplementation(DatabaseFactory.java:123)
at liquibase.database.DatabaseFactory.openDatabase(DatabaseFactory.java:143)
at liquibase.integration.commandline.CommandLineUtils.createDatabaseObject(CommandLineUtils.java:50)
... 22 more
Caused by: java.lang.NoSuchMethodException: liquibase.database.OfflineConnection.getWrappedConnection()
at java.lang.Class.getMethod(Class.java:1624)
at liquibase.database.core.OracleDatabase.setConnection(OracleDatabase.java:58)
... 25 more
I then retried with an offline H2 database, but I got a new error:
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.748 s
[INFO] Finished at: 2015-02-22T12:00:51+01:00
[INFO] Final Memory: 19M/123M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL (default-cli) on project app: Execution default-cli of goal org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL failed: A required class was missing while executing org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL: org/yaml/snakeyaml/Yaml
[ERROR] -----------------------------------------------------
[ERROR] realm = plugin>org.liquibase:liquibase-maven-plugin:3.3.2
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/Users/me/.m2/repository/org/liquibase/liquibase-maven-plugin/3.3.2/liquibase-maven-plugin-3.3.2.jar
[ERROR] urls[1] = file:/Users/me/.m2/repository/org/codehaus/plexus/plexus-utils/1.0.4/plexus-utils-1.0.4.jar
[ERROR] urls[2] = file:/Users/me/.m2/repository/org/liquibase/liquibase-core/3.3.2/liquibase-core-3.3.2.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[project>fr.cnp.grn:app:1.0.14-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
[ERROR]
[ERROR] -----------------------------------------------------: org.yaml.snakeyaml.Yaml
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL (default-cli) on project app: Execution default-cli of goal org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL failed: A required class was missing while executing org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL: org/yaml/snakeyaml/Yaml
-----------------------------------------------------
realm = plugin>org.liquibase:liquibase-maven-plugin:3.3.2
strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
urls[0] = file:/Users/me/.m2/repository/org/liquibase/liquibase-maven-plugin/3.3.2/liquibase-maven-plugin-3.3.2.jar
urls[1] = file:/Users/me/.m2/repository/org/codehaus/plexus/plexus-utils/1.0.4/plexus-utils-1.0.4.jar
urls[2] = file:/Users/me/.m2/repository/org/liquibase/liquibase-core/3.3.2/liquibase-core-3.3.2.jar
Number of foreign imports: 1
import: Entry[import from realm ClassRealm[project>fr.cnp.grn:app:1.0.14-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
-----------------------------------------------------
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:224)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:108)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:76)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:51)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:116)
at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:361)
at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:155)
at org.apache.maven.cli.MavenCli.execute(MavenCli.java:584)
at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:213)
at org.apache.maven.cli.MavenCli.main(MavenCli.java:157)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:289)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:229)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
Caused by: org.apache.maven.plugin.PluginExecutionException: Execution default-cli of goal org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL failed: A required class was missing while executing org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL: org/yaml/snakeyaml/Yaml
-----------------------------------------------------
realm = plugin>org.liquibase:liquibase-maven-plugin:3.3.2
strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
urls[0] = file:/Users/me/.m2/repository/org/liquibase/liquibase-maven-plugin/3.3.2/liquibase-maven-plugin-3.3.2.jar
urls[1] = file:/Users/me/.m2/repository/org/codehaus/plexus/plexus-utils/1.0.4/plexus-utils-1.0.4.jar
urls[2] = file:/Users/me/.m2/repository/org/liquibase/liquibase-core/3.3.2/liquibase-core-3.3.2.jar
Number of foreign imports: 1
import: Entry[import from realm ClassRealm[project>fr.cnp.grn:app:1.0.14-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
-----------------------------------------------------
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:167)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:208)
... 19 more
Caused by: org.apache.maven.plugin.PluginContainerException: A required class was missing while executing org.liquibase:liquibase-maven-plugin:3.3.2:updateSQL: org/yaml/snakeyaml/Yaml
-----------------------------------------------------
realm = plugin>org.liquibase:liquibase-maven-plugin:3.3.2
strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
urls[0] = file:/Users/me/.m2/repository/org/liquibase/liquibase-maven-plugin/3.3.2/liquibase-maven-plugin-3.3.2.jar
urls[1] = file:/Users/me/.m2/repository/org/codehaus/plexus/plexus-utils/1.0.4/plexus-utils-1.0.4.jar
urls[2] = file:/Users/me/.m2/repository/org/liquibase/liquibase-core/3.3.2/liquibase-core-3.3.2.jar
Number of foreign imports: 1
import: Entry[import from realm ClassRealm[project>fr.cnp.grn:app:1.0.14-SNAPSHOT, parent: ClassRealm[maven.api, parent: null]]]
-----------------------------------------------------
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:165)
... 20 more
Caused by: java.lang.NoClassDefFoundError: org/yaml/snakeyaml/Yaml
at liquibase.parser.core.yaml.YamlChangeLogParser.parse(YamlChangeLogParser.java:44)
at liquibase.changelog.DatabaseChangeLog.include(DatabaseChangeLog.java:356)
at liquibase.changelog.DatabaseChangeLog.handleChildNode(DatabaseChangeLog.java:248)
at liquibase.changelog.DatabaseChangeLog.load(DatabaseChangeLog.java:211)
at liquibase.parser.core.xml.AbstractChangeLogParser.parse(AbstractChangeLogParser.java:25)
at liquibase.Liquibase.getDatabaseChangeLog(Liquibase.java:215)
at liquibase.Liquibase.update(Liquibase.java:192)
at liquibase.Liquibase.update(Liquibase.java:258)
at org.liquibase.maven.plugins.LiquibaseUpdateSQL.doUpdate(LiquibaseUpdateSQL.java:49)
at org.liquibase.maven.plugins.AbstractLiquibaseUpdateMojo.performLiquibaseTask(AbstractLiquibaseUpdateMojo.java:24)
at org.liquibase.maven.plugins.AbstractLiquibaseMojo.execute(AbstractLiquibaseMojo.java:369)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:133)
... 20 more
Caused by: java.lang.ClassNotFoundException: org.yaml.snakeyaml.Yaml
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.unsynchronizedLoadClass(ClassRealm.java:259)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:235)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:227)
... 32 more
So my questions are:
Does Liquibase:updateSQL seem to you to be the right solution?
Have you succeeded in executing the Liquibase:updateSQL command in offline mode against Oracle? (And if so, how?)
Thank you for reading this far.
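For reference, the NoClassDefFoundError above indicates that SnakeYAML is not on the plugin's classpath when the YAML changelog parser runs. Below is a minimal sketch of one common way to address this, under the assumption that declaring SnakeYAML as a plugin-level dependency of liquibase-maven-plugin is enough to make the class visible to the plugin realm; the SnakeYAML version shown is illustrative, not taken from this build:
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>3.3.2</version>
  <dependencies>
    <!-- Assumption: adding SnakeYAML here makes org.yaml.snakeyaml.Yaml
         available to the plugin's class realm. -->
    <dependency>
      <groupId>org.yaml</groupId>
      <artifactId>snakeyaml</artifactId>
      <version>1.13</version> <!-- illustrative version -->
    </dependency>
  </dependencies>
</plugin>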

Publishing fails during 'committing deployment': cannot execute SQL Query

I've been banging my head against this issue for the last couple of hours and I can't seem to come up with a solution. I'm trying to set up a Tridion SP1 HR1 install with a SQL Server database.
As far as I can tell, the HTTP transport protocol should be working, since changing anything related to it still produces the same error.
As soon as a page (a pretty simple one, too) hits 'committing deployment', the publish fails with the following error:
2013-01-14 16:49:22,351 ERROR DeployPipelineExecutor - Original stacktrace for transaction: tcm:0-16-66560
com.tridion.deployer.ProcessingException: Unable to prepare transaction: tcm:0-16-66560, org.hibernate.exception.SQLGrammarException: could not execute query, org.hibernate.exception.SQLGrammarException: could not execute query
at com.tridion.deployer.phases.PreCommitPhase.handleFailure(PreCommitPhase.java:120) ~[cd_deployer.jar:na]
at com.tridion.deployer.phases.PreCommitPhase.execute(PreCommitPhase.java:101) ~[cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.runMainExecutePhase(DeployPipelineExecutor.java:186) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.doExecute(DeployPipelineExecutor.java:97) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.execute(DeployPipelineExecutor.java:61) [cd_deployer.jar:na]
at com.tridion.deployer.TransactionManager.handleDeployPackage(TransactionManager.java:80) [cd_deployer.jar:na]
at com.tridion.deployer.queue.QueueLocationHandler$1.run(QueueLocationHandler.java:176) [cd_deployer.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask.run(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.6.0_38]
at java.lang.Thread.run(Unknown Source) [na:1.6.0_38]
My storage_conf.xml looks like this:
<Storages>
    <Storage Type="persistence" Id="sqlserver" dialect="MSSQL" Class="com.tridion.storage.persistence.JPADAOFactory">
        <Pool Type="jdbc" Size="5" MonitorInterval="60" IdleTimeout="120" CheckoutTimeout="120" />
        <DataSource Class="com.microsoft.sqlserver.jdbc.SQLServerDataSource">
            <Property Name="serverName" Value="T2011-DEV" />
            <Property Name="portNumber" Value="1433" />
            <Property Name="databaseName" Value="Tridion_cm" />
            <Property Name="user" Value="secret" />
            <Property Name="password" Value="secret" />
        </DataSource>
    </Storage>
    <!--
        Configuration example for using filesystem as data storage.
    -->
    <Storage Type="filesystem" Class="com.tridion.storage.filesystem.FSDAOFactory" Id="iisFile">
        <Root Path="C:\inetpub\wwwroot\staging.dev" />
    </Storage>
</Storages>
</Global>
<!-- If no item type mappings are defined within ItemTypes, or if storages on a lower level do not exist,
     then the storage defined by the defaultStorageId attribute will be used.
     If the storage defined by defaultStorageId does not exist, an exception will be thrown. -->
<ItemTypes defaultStorageId="sqlserver" cached="false">
    <Item typeMapping="Page" cached="false" storageId="iisFile"/>
    <Item typeMapping="Binary" storageId="iisFile" cached="false"/>
</ItemTypes>
I've had some interesting issues with the Java install before, but got around those. I've even copied the .jar files from another machine that had a working HTTP transport connection and installed them over the ones I had, but that didn't work either. I also included the (assumedly correct) sqljdbc4 jar file. I have no clue what is causing this. The database connection itself seems fine, since changing the login credentials to bogus values returns a "no connection possible" (or similar) error instead.
Any help would be fantastic.
EDIT
After Sea_gull suggested enabling root logging, a lot of errors turned up. It is quite a list, but here are the most important parts:
2013-01-14 19:28:01,578 DEBUG FSEntityManager - Starting transaction tcm:0-17-66560.
2013-01-14 19:28:01,579 INFO PreCommitPhase - Executing workers for transaction: tcm:0-17-66560 with 4 Workers
2013-01-14 19:28:01,579 DEBUG PreCommitPhase - Executing worker com.tridion.storage.deploy.workers.DynamicLinkInfoTrackingWorker#76612ef6 this is worker 1 of: 4
2013-01-14 19:28:01,580 DEBUG StorageManagerFactory - Loading a non cached DAO for publicationId/typeMapping/itemExtension: 12 / DynamicLinkInfo / null
2013-01-14 19:28:01,580 DEBUG DefaultListableBeanFactory - Returning cached instance of singleton bean 'sqlserverEntityManagerFactory'
2013-01-14 19:28:01,580 DEBUG SessionImpl - opened session at timestamp: 13581880815
2013-01-14 19:28:01,580 DEBUG JDBCTransaction - begin
2013-01-14 19:28:01,581 DEBUG ConnectionManager - opening JDBC connection
2013-01-14 19:28:01,581 DEBUG JDBCTransaction - current autocommit status: true
2013-01-14 19:28:01,581 DEBUG JDBCTransaction - disabling autocommit
2013-01-14 19:28:01,582 DEBUG DefaultListableBeanFactory - Creating instance of bean 'JPADynamicLinkDAO'
2013-01-14 19:28:01,583 DEBUG DefaultListableBeanFactory - Finished creating instance of bean 'JPADynamicLinkDAO'
2013-01-14 19:28:01,583 DEBUG JPADAOFactory - Loaded DAO with type: JPADynamicLinkDAO inside transaction: tcm:0-17-66560
2013-01-14 19:28:01,583 DEBUG StorageManagerFactory - Wrapping DAO's, currently 0 wrappers installed
2013-01-14 19:28:01,583 DEBUG JPADynamicLinkDAO - Storing dynamic links from TCMURI tcm:12-88-64
2013-01-14 19:28:01,583 DEBUG JPADynamicLinkDAO - Removing dynamic links with source TCMURI tcm:12-88-64
2013-01-14 19:28:01,584 DEBUG AbstractBatcher - about to open PreparedStatement (open PreparedStatements: 0, globally: 0)
2013-01-14 19:28:01,584 DEBUG SQL - delete from DYNAMIC_LINKS where SRC_PUB_ID=? and SRC_ITEM_ID=? and SRC_ITEM_TYPE=?
2013-01-14 19:28:01,585 DEBUG AbstractBatcher - about to close PreparedStatement (open PreparedStatements: 1, globally: 1)
2013-01-14 19:28:01,587 DEBUG JDBCExceptionReporter - could not execute update query [delete from DYNAMIC_LINKS where SRC_PUB_ID=? and SRC_ITEM_ID=? and SRC_ITEM_TYPE=?]
com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'DYNAMIC_LINKS'.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1515) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:404) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:350) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:5696) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1715) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:180) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:155) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeUpdate(SQLServerPreparedStatement.java:314) ~[sqljdbc4.jar:na]
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:102) ~[commons-dbcp.jar:1.2.2]
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:102) ~[commons-dbcp.jar:1.2.2]
at org.hibernate.hql.ast.exec.BasicExecutor.execute(BasicExecutor.java:101) ~[hibernate-core.jar:3.3.2.GA]
at org.hibernate.hql.ast.QueryTranslatorImpl.executeUpdate(QueryTranslatorImpl.java:421) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.engine.query.HQLQueryPlan.performExecuteUpdate(HQLQueryPlan.java:283) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.impl.SessionImpl.executeUpdate(SessionImpl.java:1169) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.impl.QueryImpl.executeUpdate(QueryImpl.java:117) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.ejb.QueryImpl.executeUpdate(QueryImpl.java:51) [hibernate-entitymanager.jar:3.4.0.GA]
at com.tridion.storage.persistence.JPABaseDAO.executeQueryUpdate(JPABaseDAO.java:304) [cd_datalayer.jar:na]
at com.tridion.storage.persistence.JPADynamicLinkDAO.remove(JPADynamicLinkDAO.java:104) [cd_datalayer.jar:na]
at com.tridion.storage.persistence.JPADynamicLinkDAO.store(JPADynamicLinkDAO.java:43) [cd_datalayer.jar:na]
at com.tridion.storage.deploy.workers.DynamicLinkInfoTrackingWorker.doDeploy(DynamicLinkInfoTrackingWorker.java:52) [cd_datalayer.jar:na]
at com.tridion.deployer.model.transaction.TransactionLogItemWorker.doWork(TransactionLogItemWorker.java:27) [cd_model.jar:na]
at com.tridion.deployer.phases.PreCommitPhase.runPrepare(PreCommitPhase.java:143) [cd_deployer.jar:na]
at com.tridion.deployer.phases.PreCommitPhase.execute(PreCommitPhase.java:91) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.runMainExecutePhase(DeployPipelineExecutor.java:186) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.doExecute(DeployPipelineExecutor.java:97) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.execute(DeployPipelineExecutor.java:61) [cd_deployer.jar:na]
at com.tridion.deployer.TransactionManager.handleDeployPackage(TransactionManager.java:80) [cd_deployer.jar:na]
at com.tridion.deployer.queue.QueueLocationHandler$1.run(QueueLocationHandler.java:176) [cd_deployer.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask.run(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.6.0_38]
at java.lang.Thread.run(Unknown Source) [na:1.6.0_38]
2013-01-14 19:28:01,588 WARN JDBCExceptionReporter - SQL Error: 208, SQLState: S0002
2013-01-14 19:28:01,588 ERROR JDBCExceptionReporter - Invalid object name 'DYNAMIC_LINKS'.
2013-01-14 19:28:01,588 DEBUG AbstractEntityManagerImpl - mark transaction for rollback
2013-01-14 19:28:01,588 DEBUG DynamicLinkInfoTrackingWorker - Error looking up the dynamic link dao
2013-01-14 19:28:01,588 DEBUG PreCommitPhase - Executing worker: com.tridion.storage.deploy.workers.DynamicLinkInfoTrackingWorker#76612ef6 took: 9
2013-01-14 19:28:01,588 DEBUG PreCommitPhase - Executing worker com.tridion.storage.deploy.workers.ReferenceEntryWorker#66525531 this is worker 2 of: 4
2013-01-14 19:28:01,589 DEBUG StorageManagerFactory - Loading a non cached DAO for publicationId/typeMapping/itemExtension: 12 / Reference / null
2013-01-14 19:28:01,589 DEBUG DefaultListableBeanFactory - Creating instance of bean 'JPAReferenceEntryDAO'
2013-01-14 19:28:01,589 DEBUG DefaultListableBeanFactory - Finished creating instance of bean 'JPAReferenceEntryDAO'
2013-01-14 19:28:01,589 DEBUG JPADAOFactory - Loaded DAO with type: JPAReferenceEntryDAO inside transaction: tcm:0-17-66560
2013-01-14 19:28:01,589 DEBUG StorageManagerFactory - Wrapping DAO's, currently 0 wrappers installed
2013-01-14 19:28:01,590 DEBUG AbstractBatcher - about to open PreparedStatement (open PreparedStatements: 0, globally: 0)
2013-01-14 19:28:01,590 DEBUG SQL - select referencee0_.REFERENCING_URI as REFERENC1_20_, referencee0_.REFERENCED_URI as REFERENCED2_20_, referencee0_.PUBLICATION_ID as PUBLICAT3_20_ from REFERENCE_ENTRIES referencee0_ where referencee0_.PUBLICATION_ID=? and referencee0_.REFERENCING_URI=?
2013-01-14 19:28:01,592 DEBUG AbstractBatcher - about to close PreparedStatement (open PreparedStatements: 1, globally: 1)
2013-01-14 19:28:01,594 DEBUG JDBCExceptionReporter - could not execute query [select referencee0_.REFERENCING_URI as REFERENC1_20_, referencee0_.REFERENCED_URI as REFERENCED2_20_, referencee0_.PUBLICATION_ID as PUBLICAT3_20_ from REFERENCE_ENTRIES referencee0_ where referencee0_.PUBLICATION_ID=? and referencee0_.REFERENCING_URI=?]
com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'REFERENCE_ENTRIES'.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:216) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1515) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:404) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:350) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:5696) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1715) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:180) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:155) ~[sqljdbc4.jar:na]
at com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:285) ~[sqljdbc4.jar:na]
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:93) ~[commons-dbcp.jar:1.2.2]
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeQuery(DelegatingPreparedStatement.java:93) ~[commons-dbcp.jar:1.2.2]
at org.hibernate.jdbc.AbstractBatcher.getResultSet(AbstractBatcher.java:208) ~[hibernate-core.jar:3.3.2.GA]
at org.hibernate.loader.Loader.getResultSet(Loader.java:1812) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doQuery(Loader.java:697) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doQueryAndInitializeNonLazyCollections(Loader.java:259) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.loader.Loader.doList(Loader.java:2232) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.loader.Loader.listIgnoreQueryCache(Loader.java:2129) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.loader.Loader.list(Loader.java:2124) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.loader.hql.QueryLoader.list(QueryLoader.java:401) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.hql.ast.QueryTranslatorImpl.list(QueryTranslatorImpl.java:363) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.engine.query.HQLQueryPlan.performList(HQLQueryPlan.java:196) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.impl.SessionImpl.list(SessionImpl.java:1149) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.impl.QueryImpl.list(QueryImpl.java:102) [hibernate-core.jar:3.3.2.GA]
at org.hibernate.ejb.QueryImpl.getResultList(QueryImpl.java:67) [hibernate-entitymanager.jar:3.4.0.GA]
at com.tridion.storage.persistence.JPABaseDAO.executeQueryListResult(JPABaseDAO.java:266) [cd_datalayer.jar:na]
at com.tridion.storage.persistence.JPABaseDAO.executeQueryListResult(JPABaseDAO.java:234) [cd_datalayer.jar:na]
at com.tridion.storage.persistence.JPABaseDAO.executeQueryListResult(JPABaseDAO.java:217) [cd_datalayer.jar:na]
at com.tridion.storage.persistence.JPAReferenceEntryDAO.findByReferencingURI(JPAReferenceEntryDAO.java:80) [cd_datalayer.jar:na]
at com.tridion.storage.services.ReferenceCounter.replaceReferences(ReferenceCounter.java:160) [cd_datalayer.jar:na]
at com.tridion.storage.deploy.workers.ReferenceEntryWorker.doDeploy(ReferenceEntryWorker.java:61) [cd_datalayer.jar:na]
at com.tridion.deployer.model.transaction.TransactionLogItemWorker.doWork(TransactionLogItemWorker.java:27) [cd_model.jar:na]
at com.tridion.deployer.phases.PreCommitPhase.runPrepare(PreCommitPhase.java:143) [cd_deployer.jar:na]
at com.tridion.deployer.phases.PreCommitPhase.execute(PreCommitPhase.java:91) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.runMainExecutePhase(DeployPipelineExecutor.java:186) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.doExecute(DeployPipelineExecutor.java:97) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.execute(DeployPipelineExecutor.java:61) [cd_deployer.jar:na]
at com.tridion.deployer.TransactionManager.handleDeployPackage(TransactionManager.java:80) [cd_deployer.jar:na]
at com.tridion.deployer.queue.QueueLocationHandler$1.run(QueueLocationHandler.java:176) [cd_deployer.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask.run(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.6.0_38]
at java.lang.Thread.run(Unknown Source) [na:1.6.0_38]
2013-01-14 19:28:01,594 WARN JDBCExceptionReporter - SQL Error: 208, SQLState: S0002
2013-01-14 19:28:01,594 ERROR JDBCExceptionReporter - Invalid object name 'REFERENCE_ENTRIES'.
2013-01-14 19:28:01,594 DEBUG AbstractEntityManagerImpl - mark transaction for rollback
2013-01-14 19:28:01,595 WARN PreCommitPhase - Failed to Prepare: tcm:0-17-66560 error: org.hibernate.exception.SQLGrammarException: could not execute query
2013-01-14 19:28:01,595 INFO StorageManagerFactory - Rolling back storage transaction: tcm:0-17-66560
2013-01-14 19:28:01,595 DEBUG FSEntityManager - Nothing to roll back for transaction tcm:0-17-66560.
2013-01-14 19:28:01,595 DEBUG FSEntityManager - Cleaning up transaction tcm:0-17-66560.
2013-01-14 19:28:01,595 DEBUG JDBCTransaction - rollback
2013-01-14 19:28:01,596 DEBUG JDBCTransaction - re-enabling autocommit
2013-01-14 19:28:01,596 DEBUG JDBCTransaction - rolled back JDBC Connection
2013-01-14 19:28:01,596 DEBUG ConnectionManager - aggressively releasing JDBC connection
2013-01-14 19:28:01,596 DEBUG ConnectionManager - releasing JDBC connection [ (open PreparedStatements: 0, globally: 0) (open ResultSets: 0, globally: 0)]
2013-01-14 19:28:01,597 WARN DeployPipelineExecutor - Phase: Deployment Prepare Commit Phase failure message: Unable to prepare transaction: tcm:0-17-66560, org.hibernate.exception.SQLGrammarException: could not execute query, org.hibernate.exception.SQLGrammarException: could not execute query for transaction: tcm:0-17-66560
2013-01-14 19:28:01,597 DEBUG DeployPipelineExecutor - Failure in Phase: Deployment Prepare Commit Phase attempt: 11 for transaction: tcm:0-17-66560
2013-01-14 19:28:01,597 ERROR DeployPipelineExecutor - Final attempt in Phase: Deployment Prepare Commit Phase failed for transaction: tcm:0-17-66560
2013-01-14 19:28:01,598 ERROR DeployPipelineExecutor - Original stacktrace for transaction: tcm:0-17-66560
com.tridion.deployer.ProcessingException: Unable to prepare transaction: tcm:0-17-66560, org.hibernate.exception.SQLGrammarException: could not execute query, org.hibernate.exception.SQLGrammarException: could not execute query
at com.tridion.deployer.phases.PreCommitPhase.handleFailure(PreCommitPhase.java:120) ~[cd_deployer.jar:na]
at com.tridion.deployer.phases.PreCommitPhase.execute(PreCommitPhase.java:101) ~[cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.runMainExecutePhase(DeployPipelineExecutor.java:186) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.doExecute(DeployPipelineExecutor.java:97) [cd_deployer.jar:na]
at com.tridion.deployer.phases.DeployPipelineExecutor.execute(DeployPipelineExecutor.java:61) [cd_deployer.jar:na]
at com.tridion.deployer.TransactionManager.handleDeployPackage(TransactionManager.java:80) [cd_deployer.jar:na]
at com.tridion.deployer.queue.QueueLocationHandler$1.run(QueueLocationHandler.java:176) [cd_deployer.jar:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.FutureTask.run(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source) [na:1.6.0_38]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) [na:1.6.0_38]
at java.lang.Thread.run(Unknown Source) [na:1.6.0_38]
2013-01-14 19:28:01,598 INFO TransactionPersistence - Removing deployment transaction information: tcm:0-17-66560
2013-01-14 19:28:01,601 ERROR DeployPipelineExecutor - Unable to start processing deployment package with transactionId: tcm:0-17-66560
2013-01-14 19:28:01,606 DEBUG DeployPipelineExecutor - Checking if transaction is completed: tcm:0-17-66560 is true
2013-01-14 19:28:01,606 INFO DeployPipelineExecutor - Transaction is completed: tcm:0-17-66560
2013-01-14 19:28:01,616 INFO DeployPipelineExecutor - Finished executing deployment pipeline for: tcm:0-17-66560 in 30231 ms.
2013-01-14 19:28:01,616 INFO TransactionManager - Cleaning up Deployment package for transaction: tcm:0-17-66560 and type: CONTENT
2013-01-14 19:28:01,621 INFO TransactionManager - Finished handling of Deployment package: tcm:0-17-66560 with type: CONTENT
2013-01-14 19:28:01,622 DEBUG QueueLocationHandler - Removing exclusive lock on Deployment package: tcm:0-17-66560 with type: CONTENT.
2013-01-14 19:28:02,664 DEBUG HttpUploadReceiver - transactionId parameter contained an invalid TCM URI string: meta.xml, processing as normal file request
2013-01-14 19:28:02,665 INFO HttpUploadReceiver - File found at C:\tridion\incoming\meta.xml for meta.xml
2013-01-14 19:28:02,699 DEBUG HttpUploadReceiver - transactionId parameter contained an invalid TCM URI string: tcm_0-17-66560.state.xml, processing as normal file request
2013-01-14 19:28:02,700 INFO HttpUploadReceiver - File found at C:\tridion\incoming\tcm_0-17-66560.state.xml for tcm_0-17-66560.state.xml
2013-01-14 19:28:02,700 INFO HttpUploadReceiver - Removed file at C:\tridion\incoming\tcm_0-17-66560.state.xml
There are so many points of interest here that I'm not even sure where it goes wrong...
As Ram G already mentioned, this is a problem with the database being used, which appears to be missing the tables required by Content Delivery. You are probably pointing at a CM database instead of a CD one. Please make sure that your Content Delivery database is created correctly and retry your test.
Hope this helps.
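For illustration only, the fix usually boils down to pointing the persistence storage at the Content Delivery (Broker) database rather than the CM one. A hedged sketch of what the DataSource block might then look like; the database name below is a placeholder, not taken from this environment:
<DataSource Class="com.microsoft.sqlserver.jdbc.SQLServerDataSource">
    <Property Name="serverName" Value="T2011-DEV" />
    <Property Name="portNumber" Value="1433" />
    <!-- Assumption: the Content Delivery schema (the one containing tables such as
         DYNAMIC_LINKS and REFERENCE_ENTRIES seen in the log) lives in a database
         named Tridion_Broker; substitute your actual CD database name. -->
    <Property Name="databaseName" Value="Tridion_Broker" />
    <Property Name="user" Value="secret" />
    <Property Name="password" Value="secret" />
</DataSource>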

Resources