Why is R not connecting to Hadoop?
I am using R to connect to HDFS using the 'rhdfs' package. The 'rJava' package is installed and the rhdfs package is loaded.
The HADOOP_CMD environment variable is set in R using:
Sys.setenv(HADOOP_CMD='/usr/local/hadoop/bin')
But when the hdfs.init() function is called, the following error message is generated:
sh: 1: /usr/local/hadoop/bin: Permission denied
Error in .jnew("org/apache/hadoop/conf/Configuration") :
java.lang.ClassNotFoundException
In addition: Warning message:
running command '/usr/local/hadoop/bin classpath' had status 126
Also, the 'rmr2' library was loaded and the following code was run:
ints = to.dfs(1:100)
which generated the message given below:
sh: 1: /usr/local/hadoop/bin: Permission denied
The R-Hadoop packages are accessible only to the 'root' user and not to 'hduser' (the Hadoop user), since they were installed while R was run as 'root'.
There are only two common reasons for this type of problem:
1) A wrong path
2) Missing privileges/permissions on that jar
Beyond that, also set the other system paths, such as those given below.
Sys.setenv(HADOOP_HOME="/home/hadoop/path")
Sys.setenv(HADOOP_CMD="/home/hadoop/path/bin/hadoop")
Sys.setenv(HADOOP_STREAMING="/home/hadoop/path/streaming-jar-file.jar")
Sys.setenv(JAVA_HOME="/home/hadoop/java/path")
Then load library(rmr2) and library(rhdfs), and that error should not occur.
But your problem is a permission problem. As root, grant the needed permissions (755) on those files to your user, then run the jar again; the error should no longer appear. A quick check is sketched below.
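A minimal sketch of that check (assuming Hadoop lives under /usr/local/hadoop; adjust the paths to your installation):
# Point HADOOP_CMD at the hadoop executable itself, not at the bin directory
Sys.setenv(HADOOP_CMD = "/usr/local/hadoop/bin/hadoop")
# 0 means the current user has execute permission on that file
file.access(Sys.getenv("HADOOP_CMD"), mode = 1)
# The same command rhdfs runs internally; it should print the Hadoop classpath
system2(Sys.getenv("HADOOP_CMD"), "classpath")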
Try it like this:
Sys.setenv(HADOOP_CMD='/usr/local/hadoop/bin/hadoop')
Sys.setenv(JAVA_HOME='/usr/lib/jvm/java-6-openjdk-amd64')
library(rhdfs)
hdfs.init()
Make sure the HADOOP_CMD path is the full path to the hadoop executable, i.e. it ends with /bin/hadoop rather than just the bin directory. Once hdfs.init() succeeds, you can verify the connection as sketched below.
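As a quick sanity check (a sketch, assuming hdfs.init() has completed without errors), you can list the HDFS root with rhdfs:
# Should return a data frame describing the files and directories under / in HDFS
hdfs.ls("/")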
Related
I'm updating the renv folder of a project in order to adjust the libraries, but it seems I'm having a permission problem. After running renv::init() and trying to install the remaining libraries manually using install.packages(), I always get the message
Error: failed to retrieve 'https://cran.rstudio.com/bin/windows/contrib/4.2/ipeadatar_0.1.6.zip' [error code 23]
1: curl: (23) Failure writing output to destination
2: curl: (23) Failure writing output to destination
Using .libPaths() I can see that the renv library was created in the hidden "AppData" folder:
1] "C:/Users/André Ferreira/AppData/Local/R/cache/R/renv/library/MacroBRA_Wrld-09789847/R-4.2/x86_64-w64-mingw32"
So, checking my permissions, I couldn't see anything wrong. Any thoughts about this problem? The thing is that when I open my .Rmd file and try to knit, I receive the same message "1: curl: (23) Failure writing output to destination", now from the rmarkdown package installation, so it may be a configuration/permission problem.
Adding "C:\rtools42\usr\bin" and "C:\Program Files\R\R-4.2.1\bin" in the environment variable didn't help.
As far as I can see, when opening an empty file from RStudio, I can use install.packages() without a problem.
Although this doesn't solve the problem directly, you can also instruct renv to use a different library path with something like:
# use a project-local library path
RENV_PATHS_LIBRARY = renv/library
in your project's .Renviron file. Depending on your environment, you might also consider placing the library path in an alternate location.
See https://rstudio.github.io/renv/articles/packages.html#r-cmd-build-and-the-project-library for more details.
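A minimal sketch of that setup (the renv/library value is just the example from above; adjust it to your environment):
# Append the override to the project's .Renviron, then restart R
cat("RENV_PATHS_LIBRARY = renv/library\n", file = ".Renviron", append = TRUE)
# After restarting, confirm where renv will install packages
renv::paths$library()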
I am getting an error when trying to run the diskImageR package, specifically the IJMacro function, regarding an inability to locate ImageJ. This is what I think the error is stating, although I do not know for sure.
I already tried changing the path and by following the pdf associated with running the package, but I still get the same error.
IJMacro("newProject",imageJLoc ="C:\\Users\\user\\Desktop\\ImageJ")
[1] "Searching for application name or filepath: ImageJ"
Error in ij$runScript(paste(script, IJarguments)) :
The imageJ binaries have not been located. Re-initialise the imageJInterface object with the correct location for the imageJ binaries
In addition: Warning message:
In setFilePath(filePath) :
The ImageJ application could not be found in the common install location on your system
In this tutorial, there is a command pymol.dccm(cij, pdb, type="launch"). But I got
> pymol.dccm(cij, pdb, type="launch")
Error in pymol.dccm(cij, pdb, type = "launch") :
Launching external program failed
make sure 'C:/python27/PyMOL/pymol.exe' is in your search path
In addition: Warning message:
running command 'C:/python27/PyMOL/pymol.exe -cq' had status 127
I already have PyMOL installed on my PC. Can I ask how to add another search path to R?
Now I think pymol is a sub-package of bio3d. But I have already installed bio3d and other commands work (e.g. pdb <- read.pdb()), so why does the pymol command not work?
I tried
> .libPaths("path/to/pymol2/")
> .libPaths("path/to/pymol2/PyMOL")
> .libPaths("path/to/pymol2/PyMOL/PyMOLWin.exe")
> pymol.dccm(cij, pdb, type="launch")
Error in pymol.dccm(cij, pdb, type = "launch") :
Launching external program failed
make sure 'C:/python27/PyMOL/pymol.exe' is in your search path
In addition: Warning message:
running command 'C:/python27/PyMOL/pymol.exe -cq' had status 127
> PyMOLWin.dccm(cij, pdb, type="launch")
Error: could not find function "PyMOLWin.dccm"
So .libPaths() did not return an error, but pymol.dccm and PyMOLWin.dccm still did not work.
I also tried to install a pymol package in R:
> install.packages("pymol")
Warning in install.packages :
package ‘pymol’ is not available (for R version 3.2.2)
There's a mistake in the tutorial command itself. The correct syntax for dccm is
pymol(cij, pdb, type="launch", exefile="C:/Program Files/pymol")
where exefile = the file path to the ‘PYMOL’ program on your system (i.e. how ‘PYMOL’ is invoked). If NULL, an OS-dependent default path to the program is used.
Try the following code; it worked perfectly for me:
pymol(cm, pdb.open, type="launch", exefile="%userprofile%/PyMOL/PyMOLWin.exe")
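If the launch still fails, a quick sketch to confirm that the path you pass as exefile actually exists (the path below is an assumed example, not necessarily your install location):
# TRUE means R can see the PyMOL executable at that location
file.exists("C:/Users/user/PyMOL/PyMOLWin.exe")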
.libPaths("path/to/package/library") probably does what you need.
.libPaths gets/sets the library trees within which packages are looked for.
Set the path to the parent directory of the directory with the package name rather than the package directory itself.
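A minimal sketch of that (the library path below is an assumed example, not your actual location):
# Point .libPaths() at the parent directory that contains the installed package folder
.libPaths("C:/Users/user/R/library")
# The new tree should now be listed first
.libPaths()
library(bio3d)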
After installing the sparklyr package I followed the instructions here (http://spark.rstudio.com/) to connect to Spark, but I am faced with this error. Am I doing something wrong? Please help me.
sc = spark_connect( master = 'local' )
Error in file(con, "r") : cannot open the connection
In addition: Warning message:
In file(con, "r") :
cannot open file 'C:\Users\USER\AppData\Local\Temp\RtmpYb3dq4\fileff47b3411ae_spark.log':
Permission denied
But I am able to find the file at the stated location, and on opening it, I found it to be empty.
First of all, did you install sparklyr from GitHub (devtools::install_github("rstudio/sparklyr")) or from CRAN?
There were some issues some time ago with Windows installations.
The issue you have seems to be related to TEMP and TMP folder-level permissions on Windows, or to file-creation permissions. Every time you run sc <- spark_connect(), it tries to create a folder and a file to write the log files to.
Make sure you have write access to these locations. A quick check is sketched below.
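A minimal sketch of that check (the redirect path in the comment is just an assumed example):
# tempdir() is the RtmpXXXX folder where sparklyr writes its *_spark.log file
tempdir()
# 0 means the current user has write access to that folder
file.access(tempdir(), mode = 2)
# Optionally redirect the temp folders before connecting (assumed example path):
# Sys.setenv(TMPDIR = "C:/spark-tmp", TEMP = "C:/spark-tmp", TMP = "C:/spark-tmp")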
I could observe the same error message with versions 2.4.3 and 2.4.4 in different cases:
1) When trying to connect to a non-"local" master using spark_connect(master="spark://192.168.0.12:7077", ..), if the master is not started or is not responding at the specified master URL.
2) When setting a specific incomplete configuration; in my case, trying to set dynamicAllocation to true without the other required dynamicAllocation settings (a more complete sketch follows the snippet below):
conf <- spark_config()
conf$spark.dynamicAllocation.enabled <- "true"
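A minimal sketch of a more complete configuration (the extra settings and values are assumptions based on Spark's dynamic-allocation requirements, and the master URL is the example from above, so it will only work if a standalone master is actually running there):
library(sparklyr)
conf <- spark_config()
conf$spark.dynamicAllocation.enabled <- "true"
# Dynamic allocation also needs the external shuffle service; executor bounds are commonly set alongside it
conf$spark.shuffle.service.enabled <- "true"
conf$spark.dynamicAllocation.minExecutors <- 1
conf$spark.dynamicAllocation.maxExecutors <- 4
sc <- spark_connect(master = "spark://192.168.0.12:7077", config = conf)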
Why is the R MapReduce library 'rmr2' generating a warning message?
I have installed the 'rmr2' library to execute MapReduce programs in R. But when
library(rmr2)
is run in R, it generates the following warning message:
Please review your hadoop settings. See help(hadoop.settings)
Warning message:
S3 methods ‘gorder.default’, ‘gorder.factor’, ‘gorder.data.frame’, ‘gorder.matrix’, ‘gorder.raw’
were declared in NAMESPACE but not found
What could be the reason?
The main reason is that you didn't include the paths. Before running library(rmr2), you must set the following four paths to prevent this type of warning.
Sys.setenv(HADOOP_HOME="/home/hadoop/hadoop-1.1.2")  # Hadoop installation path
Sys.setenv(HADOOP_CMD="/home/hadoop/hadoop-1.1.2/bin/hadoop")  # hadoop executable (CMD) path
Sys.setenv(HADOOP_STREAMING="/home/hadoop/work/hadoop-1.1.2/contrib/streaming/hadoop-streaming-1.1.2.jar")  # streaming jar path
Sys.setenv(JAVA_HOME="/usr/lib/jvm/java-1.6.0-openjdk-amd64")  # Java installation path
Then load library(rmr2) and library(rhdfs) to do further processing; a quick end-to-end check is sketched below. All the best.
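A minimal sketch of that end-to-end check (assuming the four variables above are set correctly and HDFS is running):
library(rhdfs)
library(rmr2)
hdfs.init()
# A tiny round trip through HDFS confirms the settings work
small <- to.dfs(1:10)
from.dfs(small)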
I think you didn't write the paths as they should be:
HADOOP_CMD='/usr/local/hadoop-2.7.2/bin/hadoop'
HADOOP_STREAMING='/usr/local/hadoop-2.7.2/share/hadoop/tools/lib/hadoop-streaming-2.7.2.jar'
HADOOP_HOME='/usr/local/hadoop-2.7.2'
The quotes ('') are very important; check whether you forgot them. A sketch of setting and checking these variables follows below.
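A minimal sketch using the paths from this answer (adjust the version and locations to your installation):
Sys.setenv(HADOOP_CMD = '/usr/local/hadoop-2.7.2/bin/hadoop')
Sys.setenv(HADOOP_STREAMING = '/usr/local/hadoop-2.7.2/share/hadoop/tools/lib/hadoop-streaming-2.7.2.jar')
Sys.setenv(HADOOP_HOME = '/usr/local/hadoop-2.7.2')
# Read the values back to confirm they were set as quoted strings
Sys.getenv(c('HADOOP_CMD', 'HADOOP_STREAMING', 'HADOOP_HOME'))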