Solr server not available on: https://localhost:8983/solr/ - unix

I am new to Solr. I have a large CSV file in my HDFS location and I am trying to index it using Apache Solr, but it throws an error. Below is the command I am using:
hadoop jar jobjar/connectors/plugins/lucid.hadoop/hadoop-job/hadoop-lws-job-2.0.0-rc2-0.jar \
    com.lucidworks.hadoop.ingest.IngestJob \
    -Dlww.commit.on.close=true \
    -DcsvFieldMapping=0=id,1=text \
    -cls com.lucidworks.hadoop.ingest.CSVIngestMapper \
    -c hdp1 \
    -i /tmp/neethu/solr/mydata.csv \
    -of com.lucidworks.hadoop.io.LWMapRedOutputFormat \
    -s https://localhost:8983/solr/
The error message is:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
15/03/12 15:52:27 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
Solr server not available on: https://localhost:8983/solr/
I am unable to view the Solr admin page whenever I paste the URL https://localhost:8983/solr/ into my browser; it says "This webpage is not available", and I think this could be the reason I am getting the above error. Could anyone tell me why I am unable to view the Solr admin page in my browser? I have tried a lot of things already.
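One thing worth checking: a stock Solr install serves plain HTTP on port 8983 and only answers on https:// if SSL has been explicitly configured. As a quick sanity check (a sketch, assuming a default non-SSL Solr install on the same machine):

# Does Solr answer on plain HTTP? (the default for a stock install)
curl -v http://localhost:8983/solr/

If that responds, rerun the ingest job with -s http://localhost:8983/solr/ in place of the https:// URL.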

Related

Why does SLF4J with slf4j-nop output the StaticLoggerBinder warning when using a Groovy script with @Grab

I have a Groovy script which specifies dependencies using the Grape @Grab annotation. Included with that are a dependency on spring-web to use RestTemplate, and a dependency on slf4j-nop to avoid the Failed to load class "org.slf4j.impl.StaticLoggerBinder" warning.
#!/usr/bin/env groovy
@Grab('org.springframework:spring-web:5.3.18')
@Grab('org.slf4j:slf4j-nop:1.7.36')
import org.springframework.web.client.RestTemplate
new RestTemplate().getForObject('http://www.example.com', String)
However, despite this, I am still getting the SLF4J warning:
$ ./restTemplateLoggingTest.groovy
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Given that it's a script, it's important that the output not contain extraneous noise, as it may be used and manipulated programmatically.
What can I do to prevent this logging warning from being output when I run my script?
Experimentation has shown that attaching the dependencies to the system classloader using @GrabConfig(systemClassLoader=true) causes the logs to no longer be emitted:
#!/usr/bin/env groovy
@GrabConfig(systemClassLoader=true)
@Grab('org.springframework:spring-web:5.3.18')
@Grab('org.slf4j:slf4j-nop:1.7.36')
import org.springframework.web.client.RestTemplate
new RestTemplate().getForObject('http://www.example.com', String)
I don't know for sure why this is the cause, though I have some vague guesses.
Note that despite addressing the issue, this isn't a use that's described by the Javadocs for GrabConfig#systemClassLoader:
Set to true if you want to use the system classloader when loading the grape. This is normally only required when a core Java class needs to reference the grabbed classes, e.g. for a database driver accessed using DriverManager.

failed to fetch nginx packages

Failed to fetch http://nginx.org/packages/mainline/debian/dists/jessie/InRelease Unable to find expected entry 'nginx/binary-armhf/Packages' in Release file (Wrong sources.list entry or malformed file)
Armhf packages are not built by nginx.org. List of supported architectures: http://nginx.org/en/linux_packages.html#mainline
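If you are on armhf hardware (a Raspberry Pi, for example), one workaround is to drop the nginx.org repository entry and install Debian's own nginx build instead. A sketch, assuming the nginx.org entry was added as /etc/apt/sources.list.d/nginx.list (adjust the path if you put it in /etc/apt/sources.list directly):

# Remove the nginx.org repository entry, then install nginx from the Debian repos
sudo rm /etc/apt/sources.list.d/nginx.list
sudo apt-get update
sudo apt-get install nginx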

Avoiding mapred.child.env modification at runtime on HDP so that R can establish connection to hiveserver2 using RHive

I'm trying to get R's RHive package to communicate nicely with hiveserver2.
I receive an error while trying to connect to hiveserver2 using:
>rhive.connect(host="localhost",port=10000, hiveServer2=TRUE, user="root", password="hadoop")
The output on the initial run:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.2.0.0-2041/hadoop/client/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.2.0.0-2041/hadoop/client/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.2.0.0-2041/hadoop/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.2.0.0-2041/hive/lib/hive-jdbc-0.14.0.2.2.0.0-2041-standalone.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.2.0.0-2041/hive/lib/hive-jdbc.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/03/19 07:08:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/03/19 07:08:23 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
15/03/19 07:08:24 INFO jdbc.Utils: Supplied authorities: localhost:10000
15/03/19 07:08:24 INFO jdbc.Utils: Resolved authority: localhost:10000
15/03/19 07:08:24 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default
This leads to the error:
Error: org.apache.hive.service.cli.HiveSQLException: Error while processing statement: Cannot modify mapred.child.env at runtime. It is not in list of params that are allowed to be modified at runtime
On subsequent runs of the same command the output reduces to:
15/03/19 07:16:24 INFO jdbc.Utils: Supplied authorities: localhost:10000
15/03/19 07:16:24 INFO jdbc.Utils: Resolved authority: localhost:10000
15/03/19 07:16:24 INFO jdbc.HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default
Error: org.apache.hive.service.cli.HiveSQLException: Error while processing statement: Cannot modify mapred.child.env at runtime. It is not in list of params that are allowed to be modified at runtime
This indicates to me that I may have insufficient permissions somewhere. However, I'm running this as root, so I'm unsure of what permissions I'm missing.
I've installed RHive following the installation guidelines in the README.
NOTE: The same error occurs if I use the CRAN version of the package.
I'm currently using Hortonworks Data Platform 2.2 (HDP 2.2)'s virtual box image. As a result, hadoop and hiveserver2 are already installed. I've installed R version 3.1.2.
The following is how I am installing RHive:
# Set up paths for HIVE_HOME, HADOOP_HOME, and HADOOP_CONF
export HIVE_HOME=/usr/hdp/2.2.0.0-2041/hive
export HADOOP_HOME=/usr/hdp/2.2.0.0-2041/hadoop
export HADOOP_CONF_DIR=/etc/hadoop/conf
# R Location via RHOME
R_HOME=/usr/lib64/R
# Place R_HOME into hadoop config location
sudo sh -c "echo \"R_HOME='$R_HOME'\" >> $HADOOP_HOME/conf/hadoop-env.sh"
# Add remote enable to Rserve config.
sudo sh -c "echo 'remote enable' >> /etc/Rserv.conf"
# Launch the daemon
R CMD Rserve
# Confirm launch
netstat -nltp
# Install ant to build java files
sudo yum -y install ant
# Install package dependencies
sudo R --no-save << EOF
install.packages( c('rJava','Rserve','RUnit'), repos='http://cran.us.r-project.org', INSTALL_opts=c('--byte-compile') )
EOF
# Install RHive package
git clone https://github.com/nexr/RHive.git
cd RHive
ant build
sudo R CMD INSTALL RHive
To check, either open R and run the statements between the EOF markers, or run the command below directly from the shell:
sudo R --no-save << EOF
Sys.setenv(HIVE_HOME="/usr/hdp/2.2.0.0-2041/hive")
Sys.setenv(HADOOP_HOME="/usr/hdp/2.2.0.0-2041/hadoop")
Sys.setenv(HADOOP_CONF_DIR="/etc/hadoop/conf")
library(RHive)
rhive.connect(host="localhost",port=10000, hiveServer2=TRUE, user="root", password="hadoop")
EOF
The answer is mentioned at this link.
Basically, you have to add a property "hive.security.authorization.sqlstd.confwhitelist.append" with value "mapred.child.env" in /etc/hive/conf/hive-site.xml
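For reference, the property stanza in hive-site.xml would look like this (a sketch based on the property name and value above; the value is appended to the whitelist of parameters that may be modified at runtime):

<property>
  <name>hive.security.authorization.sqlstd.confwhitelist.append</name>
  <value>mapred.child.env</value>
</property>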
This solution worked for me, but I used Ambari UI to make this configuration change.

rhive.connect error - file:///rhive/lib/2.0-0.0/rhive_udf.jar does not exist

I am facing an error while connecting R with Hive using the RHive package. The package installed perfectly, but it returns an error when calling rhive.connect. Please note the following:
Rserve is running as a daemon
R and Hive are installed on separate servers but within the same cluster
RHive was built from source using git. The version is 2.0-0.0
I am connecting to hiveserver running on port 10000
The error message says "file:///rhive/lib/2.0-0.0/rhive_udf.jar does not exist", although the file is there (in the Linux directory) and the entire directory and file have full permissions.
Below is the snapshot of the error:
library(RHive)
Loading required package: rJava
Loading required package: Rserve
rhive.env()
hadoop home: /opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop
hive home: /opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hive
rhive.connect("10.0.192.108")
14/07/04 00:45:51 INFO Configuration.deprecation: fs.default.name is deprecated. Instead, use fs.defaultFS
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/client-0.20/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/client/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-5.0.2-1.cdh5.0.2.p0.13/lib/hadoop/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
14/07/04 00:45:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Warning:
+----------------------------------------------------------+
+ / hiveServer2 argument has not been provided correctly. +
+ / RHive will use a default value: hiveServer2=TRUE. +
+----------------------------------------------------------+
Error: java.sql.SQLException: Error while processing statement: file:///rhive/lib/2.0-0.0/rhive_udf.jar does not exist.
Can someone please help? Thank you.
You need to specify the defaultFS parameter. If you don't, RHive tries to write to the local filesystem instead of HDFS:
rhive.connect("10.0.192.108", defaultFS="hdfs://10.0.192.108:8020")

swagger-document-override/md-override-loader - FAILED

I am able to generate code using AutoRest with my API when I host it on a Server 2012 R2 machine running IIS.
However, when I try to run it with the localhost URL, I get a "could not read" message.
I can read swagger.json in the browser.
I am using the command
autorest --input-file=https://localhost:44348/api-docs/v1/swagger.json
--output-folder=generated --csharp --namespace=DD.MyApp.Connector
The output is
AutoRest code generation utility [version: 2.0.4283; node: v10.11.0]
(C) 2018 Microsoft Corporation.
https://aka.ms/autorest
Loading AutoRest core 'C:\Users\kirst\.autorest\@microsoft.azure_autorest-core@2.0.4289\node_modules\@microsoft.azure\autorest-core\dist' (2.0.4289)
Loading AutoRest extension '@microsoft.azure/autorest.csharp' (~2.3.79->2.3.82)
Loading AutoRest extension '@microsoft.azure/autorest.modeler' (2.3.55->2.3.55)
FATAL: swagger-document-override/md-override-loader - FAILED
FATAL: Error: Could not read 'https://localhost:44348/api-docs/v1/swagger.json'.
FATAL: swagger-document/loader - FAILED
FATAL: Error: Could not read 'https://localhost:44348/api-docs/v1/swagger.json'.
Process() cancelled due to exception : Could not read 'https://localhost:44348/api-docs/v1/swagger.json'.
Error: Could not read 'https://localhost:44348/api-docs/v1/swagger.json'.
After studying the issue on GitHub, I tried starting the API using dotnet run, but it did not help.
I also tried running autorest in a DOS command shell with admin privileges.
As per the GitHub issue, I can save the swagger.json to a file and generate the code by referencing the file, but that isn't a great solution.
Are you attempting to use https with localhost? If you haven't put a certificate on there, you shouldn't be using https.
Try it with http://localhost...
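For example, the same command with only the scheme changed (a sketch; this assumes the API also listens on plain HTTP at that address):

autorest --input-file=http://localhost:44348/api-docs/v1/swagger.json --output-folder=generated --csharp --namespace=DD.MyApp.Connector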
