In this tutorial there is a command pymol.dccm(cij, pdb, type="launch"), but when I run it I get:
> pymol.dccm(cij, pdb, type="launch")
Error in pymol.dccm(cij, pdb, type = "launch") :
Launching external program failed
make sure 'C:/python27/PyMOL/pymol.exe' is in your search path
In addition: Warning message:
running command 'C:/python27/PyMOL/pymol.exe -cq' had status 127
I already have PyMOL installed on my PC. How can I add another search path to R?
I now think pymol is a sub-package of bio3d. But I have already installed bio3d and other commands work (e.g. pdb <- read.pdb()), so why does the pymol command not work?
I tried
> .libPaths("path/to/pymol2/")
> .libPaths("path/to/pymol2/PyMOL")
> .libPaths("path/to/pymol2/PyMOL/PyMOLWin.exe")
> pymol.dccm(cij, pdb, type="launch")
Error in pymol.dccm(cij, pdb, type = "launch") :
Launching external program failed
make sure 'C:/python27/PyMOL/pymol.exe' is in your search path
In addition: Warning message:
running command 'C:/python27/PyMOL/pymol.exe -cq' had status 127
> PyMOLWin.dccm(cij, pdb, type="launch")
Error: could not find function "PyMOLWin.dccm"
So .libPaths() did not return an error, but pymol.dccm and PyMOLWin.dccm still did not work.
I also tried to install pymol package in R
> install.packages("pymol")
Warning in install.packages :
package ‘pymol’ is not available (for R version 3.2.2)
There's a mistake in the tutorial command itself. The correct syntax for dccm is
pymol(cij, pdb, type="launch",exefile="C:/Program Files/pymol")
where exefile is the file path to the PyMOL program on your system (i.e. how PyMOL is invoked). If NULL, an OS-dependent default path to the program is used.
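For example, a minimal sketch assuming PyMOL is installed at the path shown in your error message (adjust exefile to wherever your PyMOL executable actually lives):
exe <- "C:/python27/PyMOL/pymol.exe"   # path taken from the error above; yours may differ
file.exists(exe)                       # should be TRUE before launching
pymol(cij, pdb, type = "launch", exefile = exe)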
Try the following code; it worked perfectly for me:
pymol(cm, pdb.open, type="launch", exefile="%userprofile%/PyMOL/PyMOLWin.exe")
.libPaths("path/to/package/library") probably does what you need.
.libPaths gets/sets the library trees within which packages are looked for.
Set the path to the parent directory of the directory with the package name rather than the package directory itself.
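For example (a sketch; the library path below is hypothetical and should be whichever library tree actually contains the bio3d folder on your machine):
.libPaths("C:/Users/yourname/Documents/R/win-library/3.2")  # parent directory containing bio3d/, not bio3d/ itself
.libPaths()    # confirm the new tree is now listed
library(bio3d)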
I have installed the h2o package (in R, from the RStudio console). After h2o.init() I am trying to use the built-in functions h2o.upload_model()/h2o.upload_mojo(), but I am getting the following errors.
h2o.upload_mojo()
Error in h2o.upload_mojo() : could not find function "h2o.upload_mojo"
h2o.upload_model()
Error in h2o.upload_model() : could not find function "h2o.upload_model"
I found a workaround to resolve this issue. Here are the steps I followed:
Remove the package using: remove.packages("h2o")
Quit the current R session and launch a new one.
Move out (or delete) the h2o lock directory from the path where the package was installed (usually under your R library), named something like 00LOCK-h2o.
Install the new/latest version of the package via the RStudio console using install.packages().
This should resolve the issue.
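A minimal R sketch of these steps (the lock-directory name is the one mentioned above; the library location is whatever .libPaths() reports on your system):
remove.packages("h2o")
# restart R/RStudio here, then remove the leftover lock directory
unlink(file.path(.libPaths()[1], "00LOCK-h2o"), recursive = TRUE)
install.packages("h2o")
library(h2o)
h2o.init()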
I'm trying to load papaja in R (version 3.6.0). I'm running Windows 10 on my computer. When I try to run devtools::install_github("crsh/papaja") I get the following error message:
package ‘markdown’ successfully unpacked and MD5 sums checked
The downloaded binary packages are in
C:\Users\My
Name\AppData\Local\Temp\RtmpKCmBDG\downloaded_packages
ERROR
cannot change to directory 'C:\Users\My'
The system cannot find the path specified.
Error in (function (command = NULL, args = character(), error_on_status =
TRUE, :
System command error
From some research, including this post and this post, I realise that this is a common issue, but none of the answers I can find help me work around this in papaja.
For info, I don't have this problem when installing other packages using install.packages().
I have managed to solve this problem after hours and hours of googling. The problem is not papaja; it is the presence of spaces in the path, which devtools does not deal with well.
To solve this, I first changed my library path:
.libPaths("C:/Program Files/R/R-3.6.1/library") # for R v.3.6.1
At first I couldn't get this to work - I kept getting the error 'lib = "C:/Program Files/R/R-3.6.1/library"' is not writable. Basically, this is because I did not have permission to write into this folder. To fix this, simply close RStudio, go to the folder where the program is installed, right-click the RStudio executable and select 'Run as administrator'.
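Putting it together, this is roughly the sequence that worked for me (the library path depends on your R version, and RStudio must be running as administrator so the folder is writable):
.libPaths("C:/Program Files/R/R-3.6.1/library")  # library path from this answer; adjust to your R version
devtools::install_github("crsh/papaja")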
Why is R not connecting to Hadoop?
I am using R to connect to HDFS using the 'rhdfs' package. The 'rJava' package is installed and the rhdfs package is loaded.
The HADOOP_CMD environment variable is set in R using:
Sys.setenv(HADOOP_CMD='/usr/local/hadoop/bin')
But when the hdfs.init() function is called, the following error message is generated:
sh: 1: /usr/local/hadoop/bin: Permission denied
Error in .jnew("org/apache/hadoop/conf/Configuration") :
java.lang.ClassNotFoundException
In addition: Warning message:
running command '/usr/local/hadoop/bin classpath' had status 126
Also, the 'rmr2' library was loaded and the following code was typed:
ints = to.dfs(1:100)
which generated the message given below:
sh: 1: /usr/local/hadoop/bin: Permission denied
The R-Hadoop packages are accessible only to the 'root' user and not 'hduser' (Hadoop user), since they were installed when R was run by the 'root' user.
There are only two common reasons for this type of problem:
1) a wrong path
2) no privileges/permissions on that jar
Beyond that, also set the other system paths, such as those given below.
Sys.setenv(HADOOP_HOME="/home/hadoop/path")
Sys.setenv(HADOOP_CMD="/home/hadoop/path/bin/hadoop")
Sys.setenv(HADOOP_STREAMING="/home/hadoop/path/streaming-jar-file.jar")
Sys.setenv(JAVA_HOME="/home/hadoop/java/path")
Then load library(rmr2) and library(rhdfs), and that error should not occur.
But your problem is a permission problem. As root, grant the required permissions (e.g. 755) on those files to your user, then run the jar file again and the error should no longer appear.
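As a quick check from R, you can verify that HADOOP_CMD points at the hadoop binary itself and that your user is allowed to execute it (a sketch; file.access() returns 0 when the requested access is allowed):
Sys.setenv(HADOOP_CMD = "/usr/local/hadoop/bin/hadoop")  # the hadoop binary, not the bin/ directory
file.exists(Sys.getenv("HADOOP_CMD"))
file.access(Sys.getenv("HADOOP_CMD"), mode = 1) == 0     # TRUE if executable by the current user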
Try it like this:
Sys.setenv(HADOOP_CMD='/usr/local/hadoop/bin/hadoop')
Sys.setenv(JAVA_HOME='/usr/lib/jvm/java-6-openjdk-amd64')
library(rhdfs)
hdfs.init()
Make sure you give the correct HADOOP_CMD path, extended with /bin/hadoop.
OS: Windows 7
R version 3.0.3
What I typed on R console :
if(!file.exists("data")){dir.create("data")}
fileUrl <- "https://data.baltimorecity.gov/api/views/dz54-2aru/rows.xlsx?accessType=DOWNLOAD"
download.file(fileUrl,destfile="./data/cameras.xlsx",method="curl")
dateDownloaded <- date()
What I got on R console
Warning messages:
1: running command 'curl "https://data.baltimorecity.gov/api/views/dz54-2aru/rows.xlsx?accessType=DOWNLOAD" -o "./data/cameras.xlsx"' had status 127
2: In download.file(fileUrl, destfile = "./data/cameras.xlsx", method = "curl") :
download had nonzero exit status
How do I make it right?
Your code works for me. You probably don't have cURL installed on your system. See ?download.file:
For methods "wget", "curl" and "lynx" a system call is made to the tool given by method, and the respective program must be installed on your system and be in the search path for executables.
Install cURL and make sure it is in your path.
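You can check from R whether curl is actually on the search path (Sys.which() returns an empty string when the program cannot be found); the directory below is just a hypothetical install location:
Sys.which("curl")
# if empty, prepend the folder containing curl.exe to PATH for this session
Sys.setenv(PATH = paste("C:/Program Files/curl/bin", Sys.getenv("PATH"), sep = ";"))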
After removing the method="curl" parameter, it works well on Windows.
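That is, a sketch of the same call relying on the default download method; mode = "wb" keeps the binary .xlsx file intact on Windows:
download.file(fileUrl, destfile = "./data/cameras.xlsx", mode = "wb")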
I installed R 2.15.2 on a Windows PC.
Hadoop & Hive are on another PC.
I loaded RHive and its dependencies into R.
Now I am trying to connect to Hive.
> Sys.setenv(HIVE_HOME="/home/hadoop/hive-0.7.0-cdh3u0")
> Sys.setenv(HADOOP_HOME="/home/hadoop/hadoop-0.20.2-cdh3u0")
> library(RHive)
> rhive.env(ALL=TRUE)
Hive Home Directory : /home/hadoop/hive-0.7.0-cdh3u0
Hadoop Home Directory : /home/hadoop/hive-0.7.0-cdh3u0
Hadoop Conf Directory :
No RServe
Disconnected HiveServer and HDFS
RHive Library List
C:/Program Files/R/R-2.15.2/library/RHive/java/rhive_udf.jar /home/hadoop/hive-0.7.0-cdh3u0/conf
> rhive.init()
[1] "there is no slaves file of HADOOP. so you should pass hosts argument when you call rhive.connect()."
Error in .jnew("org/apache/hadoop/conf/Configuration") :
java.lang.ClassNotFoundException
In addition: Warning message:
In file(file, "rt") :
cannot open file '/home/hadoop/hadoop-0.20.2-cdh3u0/conf/slaves': No such file or directory
> rhive.connect(hdfsurl="hdfs://212.63.135.149:9000/")
Error in .jnew("org/apache/hadoop/conf/Configuration") :
java.lang.ClassNotFoundException
The result is an error in the connection!
I even tried
rhive.connect(host = "212.63.135.149", port = 10000, hdfsurl="hdfs://212.63.135.149:9000/")
but it did not help.
I had the same problem a few weeks ago when installing RHive. It is because some jar files are not in the classpath which is set in rhive.init.
You need to set the arguments hive, libs, hadoop_home, hadoop_conf, hlibs which indicate where these jar files are located.
I first installed from source; that worked with rhive.init, but rhive.connect did not work properly. It worked like a charm when I installed Hive through the Cloudera manager (https://ccp.cloudera.com/display/CDH4DOC/Hive+Installation), so I advise you to follow the instructions there; it is well documented.
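As a sketch of the first approach, passing the locations explicitly to rhive.init() (the argument names are the ones listed above; the values are the paths from your question and will need adjusting to wherever your jars and config actually live):
library(RHive)
rhive.init(hive = "/home/hadoop/hive-0.7.0-cdh3u0",
           hadoop_home = "/home/hadoop/hadoop-0.20.2-cdh3u0",
           hadoop_conf = "/home/hadoop/hadoop-0.20.2-cdh3u0/conf",
           libs = "/home/hadoop/hive-0.7.0-cdh3u0/lib")
rhive.connect(host = "212.63.135.149", port = 10000, hdfsurl = "hdfs://212.63.135.149:9000/")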
Probably it is because of the wrong Hadoop version: RHive does not work with YARN, so use hadoop-0.20.205.0 or earlier.
I fixed it by fixing the rhive_udf.jar classpath (the jar is found in the RHive source directory after a build):
mkdir -p /usr/lib64/R/library/RHive/java
cp rhive_udf.jar /usr/lib64/R/library/RHive/java
chmod 755 /usr/lib64/R/library/RHive/java/rhive_udf.jar
R
> library("rJava")
> .jinit()
> .jaddClassPath("/usr/lib64/R/library/RHive/java/rhive_udf.jar")
Then test the newly added classpath with:
> .jclassPath()
You should see '/usr/lib64/R/library/RHive/java/rhive_udf.jar' in a list!
Then restart R - and here you go!