I am getting an SLF4J error while booting up a Corda network. I am using JDK 8u221, Corda 4.1, and Gradle on Windows OS.
Logs:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
I tried the same with JDK 8u181 and JDK 8u191. The environment variables and PATH are set correctly.
This issue could be due to the version of SLF4J. Use the 1.6 or 1.7 series of SLF4J (a Gradle sketch follows the excerpt below).
For details, see https://www.slf4j.org/api/org/slf4j/impl/StaticLoggerBinder.html
Excerpts from the above link -
As of SLF4J version 1.8.0, the static binder mechanism (StaticLoggerBinder) is deprecated.
This class exists for backward compatibility with earlier versions of slf4j-api.jar, in particular the 1.6.x and 1.7.x series.
Note that this class is likely to be removed in future releases of SLF4J.
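A minimal Gradle sketch of one way to apply this, assuming a Gradle build (the binding artifact and version numbers here are illustrative assumptions, not taken from the original post): pin slf4j-api to the 1.7.x series and put a matching binding on the runtime classpath so StaticLoggerBinder can be found.
// build.gradle -- hedged sketch; artifact choices and versions are assumptions
dependencies {
    // Stay on the 1.7.x API, which still resolves bindings via StaticLoggerBinder
    implementation 'org.slf4j:slf4j-api:1.7.30'
    // Any 1.7.x binding removes the NOP fallback; slf4j-simple is one example
    runtimeOnly 'org.slf4j:slf4j-simple:1.7.30'
}
Any other 1.7.x binding (for example logback-classic or slf4j-log4j12) would serve the same purpose; the point is that exactly one binding should be present at runtime.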
Related
I have a Groovy script which specifies dependencies using the Grape @Grab annotation. Among them are a dependency on spring-web (to use RestTemplate) and a dependency on slf4j-nop (to avoid the Failed to load class "org.slf4j.impl.StaticLoggerBinder" warning).
#!/usr/bin/env groovy
@Grab('org.springframework:spring-web:5.3.18')
@Grab('org.slf4j:slf4j-nop:1.7.36')
import org.springframework.web.client.RestTemplate

new RestTemplate().getForObject('http://www.example.com', String)
However, despite this, I am still getting the SLF4J warning:
$ ./restTemplateLoggingTest.groovy
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Given that it's a script, it's important that the output not contain extraneous noise, as it may be used and manipulated programmatically.
What can I do to prevent this logging warning from being output when I run my script?
Experimentation has shown that attaching the dependencies to the system classloader using @GrabConfig(systemClassLoader=true) causes the logs to no longer be emitted:
#!/usr/bin/env groovy
@GrabConfig(systemClassLoader=true)
@Grab('org.springframework:spring-web:5.3.18')
@Grab('org.slf4j:slf4j-nop:1.7.36')
import org.springframework.web.client.RestTemplate

new RestTemplate().getForObject('http://www.example.com', String)
I don't know for sure why this works. One plausible explanation is classloader visibility: the slf4j-api classes that emit the warning are loaded by a classloader that cannot see jars grabbed into Groovy's own loader, whereas grabs attached to the system classloader are visible to it.
Note that despite addressing the issue, this isn't a use that's described by the Javadocs for GrabConfig#systemClassLoader:
Set to true if you want to use the system classloader when loading the grape. This is normally only required when a core Java class needs to reference the grabbed classes, e.g. for a database driver accessed using DriverManager.
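To probe that guess, here is a small diagnostic sketch (my own, not from the original post) that prints which classloaders are involved; with systemClassLoader=true the grabbed binding and the SLF4J API should be visible to the same loader.
#!/usr/bin/env groovy
// Hedged diagnostic sketch: inspect classloaders to see why the binding
// becomes visible when grabs are attached to the system classloader.
@GrabConfig(systemClassLoader=true)
@Grab('org.slf4j:slf4j-nop:1.7.36')
import org.slf4j.LoggerFactory

println "slf4j-api loaded by:    ${LoggerFactory.classLoader}"
println "script class loaded by: ${this.class.classLoader}"
println "system classloader:     ${ClassLoader.systemClassLoader}"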
We've built a CorDapp which connects to RabbitMQ and requires a configuration file to configure the subscribers and publishers for the message queues.
The CorDapp is built against 3.2-corda but fails to load the configuration file when running on a Corda Enterprise 3.2 node.
The following exception is appended to the logs when we start the Corda web server:
Starting as webserver: localhost:8080
[ERROR] 11:59:24+0000 [main] messaging.XXX.initializeQueues - Exception caught when subscribing to Rabbit queues
[ERROR] 11:59:24+0000 [main] messaging.XXX.initializeQueues - net.corda.nodeapi.internal.config.ConfigUtilities.parseAs(Lcom/typesafe/config/Config;Lkotlin/reflect/KClass;)Ljava/lang/Object;
java.lang.NoSuchMethodError: net.corda.nodeapi.internal.config.ConfigUtilities.parseAs(Lcom/typesafe/config/Config;Lkotlin/reflect/KClass;)Ljava/lang/Object;
Nov 27, 2018 11:59:25 AM org.glassfish.jersey.internal.inject.Providers checkProviderRuntime
The code that loads the configuration is as follows:
val connectionConfig = defaultConfig!!
.resolve()
.getConfig("app-integration.rabbitMqConnectionConfiguration")
.parseAs<RabbitMqConnectionConfiguration>()
Given that we are calling the generic parseAs<RabbitMqConnectionConfiguration>() method, we assume it delegates to a parseAs(Config, KClass): Object method, but for some reason that method appears to be missing at runtime.
I would actually take a look at @joel's suggestion above. Corda is not an 'open core' project just yet,
so it's possible that you should compile the CorDapp against the Enterprise jar to determine whether the issue is in the app or in a differing API between the two Corda distributions (a sketch follows below).
My other suspicion is that different Java versions are being used by open-source Corda and Corda Enterprise.
I would also check whether the issue persists in the latest version of Corda.
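A hedged build.gradle sketch of what retargeting the compilation might look like; the Enterprise group id, version string, and configuration names here are assumptions to be verified against your R3 distribution.
// build.gradle -- hedged sketch; group ids, versions, and configurations are assumptions
ext {
    // Open source artifacts use group 'net.corda', version '3.2-corda';
    // Enterprise artifacts are assumed to use group 'com.r3.corda', version '3.2'
    corda_release_group = 'com.r3.corda'
    corda_release_version = '3.2'
}

dependencies {
    // Compile against the same distribution the node runs on, so internal
    // methods such as ConfigUtilities.parseAs resolve to the same signatures
    cordaCompile "$corda_release_group:corda-core:$corda_release_version"
    cordaCompile "$corda_release_group:corda-node-api:$corda_release_version"
}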
The SparkContext in SparkR (v1.5.1) is a
Java ref type org.apache.spark.api.java.JavaSparkContext
However, when creating an instance of my class with
.jnew("com.example.MyClass", "sc")
for my Scala class class TableReader(sc: JavaSparkContext), I get a java.lang.NoSuchMethodError.
What is this "Java ref type", and how can I get the actual context out of it to pass through rJava?
SparkR seems to have its own Java interoperability layer implemented in backend.R. Calls are made in the form SparkR:::newJObject(className, args), e.g. SparkR:::newJObject("com.example.MyClass", sc), though I can't find any specific documentation for it other than the tests in the same project.
Also, sqlContext needs to be initialized, and the relevant jars loaded at startup using --jars {csv jars} or --packages, as noted in the documentation.
I am new to Solr. I have a large CSV file in my HDFS location and I am trying to index it using Apache Solr, but it throws an error. Below is the command I am using:
hadoop jar jobjar/connectors/plugins/lucid.hadoop/hadoop-job/hadoop-lws-job-2.0.0-rc2-0.jar com.lucidworks.hadoop.ingest.IngestJob -Dlww.commit.on.close=true -DcsvFieldMapping=0=id,1=text -cls com.lucidworks.hadoop.ingest.CSVIngestMapper -c hdp1 -i /tmp/neethu/solr/mydata.csv -of com.lucidworks.hadoop.io.LWMapRedOutputFormat -s https://localhost:8983/solr/
The error message is:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
15/03/12 15:52:27 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
Solr server not available on: https://localhost:8983/solr/
I am unable to view the Solr admin page when I paste the URL https://localhost:8983/solr/ into my browser. It says "This webpage is not available", and I think this could be the reason I am getting the above error. Could anyone tell me why I am unable to view the Solr admin page in my browser? I have tried a lot to resolve this.
I am facing an error while connecting R with Hive using the RHive package. The package installed perfectly, but rhive.connect returns an error. Please note the following:
Rserve is running as a daemon
R and Hive are installed on separate servers but within the same cluster
RHive was built from source using git. The version is 2.0-0.0
I am connecting to hiveserver running on port 10000
The error message says "file:///rhive/lib/2.0-0.0/rhive_udf.jar does not exist", although the file is there (in the Linux directory) and the entire directory and file have full permissions.
Below is a snapshot of the error:
library(RHive)
Loading required package: rJava
Loading required package: Rserve
rhive.env()
hadoop home: /opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop
hive home: /opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hive
rhive.connect("10.0.192.108")
14/07/04 00:45:51 INFO Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/client-0.20/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/client/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/07/04 00:45:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Warning:
+----------------------------------------------------------+
+ / hiveServer2 argument has not been provided correctly. +
+ / RHive will use a default value: hiveServer2=TRUE. +
+----------------------------------------------------------+
Error: java.sql.SQLException: Error while processing statement: file:///rhive/lib/2.0-0.0/rhive_udf.jar does not exist.
Can someone please help? Thank you.
You need to specify the defaultFS parameter. If you don't, RHive tries to write to the local filesystem instead of HDFS.
rhive.connect("10.0.192.108", defaultFS="hdfs://10.0.192.108:8020")
Here 8020 is the default HDFS NameNode port; adjust it to match your cluster's fs.defaultFS.