RStudio cannot find any package after laptop restart

My R script worked fine in RStudio (version 0.98.1091) on Windows 7. Then I restarted my laptop, opened RStudio again, and now it produces the following error messages each time I try to execute my code:
cl <- makeCluster(mc); # build the cluster
Error: could not find function "makeCluster"
> registerDoParallel(cl)
Error: could not find function "registerDoParallel"
> fileIdndexes <- gsub("\\.[^.]*","",basename(SF))
Error in basename(SF) : object 'SF' not found
These error messages are slightly different each time I run the code. It seems that RStudio cannot find any function that is used in the code.
I restarted the R session, cleared the workspace, and restarted RStudio. Nothing helps.
It should be noted that after many attempts to execute the code, it did eventually run. However, after 100 iterations it crashed with a message about localhost being unavailable.

Add library(*the package needed/where the function is*) for each of the packages you're using. Loaded packages do not survive a restart: after a new session starts, functions such as makeCluster() and registerDoParallel() are not on the search path until their packages are attached again, which is exactly why R reports that it cannot find them.
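For the code in the question, that means loading the parallel backend packages at the top of the script. A minimal sketch, assuming the parallel/doParallel stack implied by the question (the way mc is computed here is an assumption):

library(parallel)     # provides makeCluster(), detectCores()
library(doParallel)   # provides registerDoParallel()

mc <- max(1, detectCores() - 1)  # assumption: leave one core free
cl <- makeCluster(mc)            # build the cluster
registerDoParallel(cl)           # register it as the foreach backend
# ... parallel work goes here ...
stopCluster(cl)                  # shut the workers down when finished

Because nothing about loaded packages survives a restart, these library() calls must run in every new session before the rest of the script.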

Related

R: "internet routines cannot be loaded" when starting from RStudio

I am running Red Hat Enterprise Linux (RHEL) 8.5 with Linux kernel 4.18 and Gnome 3.32.2. In this system, I've got R 4.1.2 compiled with the tool asdf with shared libraries enabled. On top of that, I installed RStudio 2021.09.01-372 from an RPM from the official RStudio website.
When I start RStudio, the first line of output after the usual R startup messages is an error:
Error in tools::startDynamicHelp() : internet routines cannot be loaded
I am unable to figure out what's causing this error, and while it persists I can't do things like refreshing CRAN or updating packages. But if I start a plain R session from a terminal (instead of RStudio), this error does not occur.
Some things I tried:
Installing the krb5 and libssh2 packages on my host system: didn't help.
Starting a "pure" R session (both with and without the --vanilla argument) from the Terminal tab within RStudio also gives this error. If I try to run update.packages() from this session, it pops up a window to select a CRAN mirror, then fails with the following:
Warning: failed to download mirrors file (internet routines cannot be loaded); using local file '/home/[my username]/.asdf/installs/R/4.1.2/lib64/R/doc/CRAN_mirrors.csv'
Warning: unable to access index for repository https://cloud.r-project.org/src/contrib:
internet routines cannot be loaded
Warning message:
In download.file(url, destfile = f, quiet = TRUE) :
unable to load shared object '/home/penyuan/.asdf/installs/R/4.1.2/lib64/R/modules//internet.so':
/lib64/libssh.so.4: undefined symbol: EVP_KDF_ctrl, version OPENSSL_1_1_1b
But as I said, the strange thing is that these errors don't happen if I start an R session outside of RStudio.
Within RStudio, the only workaround I can find is to run this command upon startup (suggested in this thread):
options(download.file.method="wget")
Once this is done, everything else seems to work, such as package updates.
However, I don't want to do this manually every time I start RStudio, so I tried putting it into ~/.Rprofile, along with a test print(), as follows:
print("This is `~/.Rprofile`")
options(download.file.method="wget")
When I open RStudio, I can see the output of the print() call, but the options() setting does not take effect: the original error shows up again, and I still have to enter options(download.file.method="wget") manually every time.
I also tried folding everything into a .First function in ~/.Rprofile as follows:
.First <- function() {
options(download.file.method="wget")
print("This is the `.First` function in `~/.Rprofile`")
}
Unfortunately, the result is the same as before: the print() output appears, but the options() setting does not take effect.
I also made sure that my ~/.Rprofile ends with a trailing newline, as discussed here, but this didn't help.
The above are the steps I've tried so far.
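One additional check that might help localize the problem (a hedged diagnostic sketch; the internet.so path is copied from the warning above): run the following both in RStudio's console and in a plain terminal R session, and compare the output. If the resolved libssl/libcrypto paths or LD_LIBRARY_PATH differ, RStudio's environment is pulling in an OpenSSL build that lacks EVP_KDF_ctrl.

Sys.getenv("LD_LIBRARY_PATH")  # does RStudio set this differently?
# which ssl/crypto/ssh libraries does the linker resolve for internet.so?
system("ldd /home/penyuan/.asdf/installs/R/4.1.2/lib64/R/modules/internet.so | grep -iE 'ssl|crypto|ssh'")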
Why does this error only occur when running RStudio, or a terminal within RStudio? Why doesn't it happen if I start R from a terminal outside of RStudio?
Is there a way to solve the problem so that the error doesn't happen in the first place? If it can't be solved, how do I set up my ~/.Rprofile so that options(download.file.method="wget") actually takes effect?
Thank you.
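If RStudio applies its own download settings after ~/.Rprofile runs (which would explain why the option set there has no effect), deferring the call with RStudio's documented rstudio.sessionInit hook may work. A hedged sketch for ~/.Rprofile, not verified on this exact setup:

# Run options() only once the RStudio session has fully initialized,
# after any defaults RStudio itself applies during startup.
setHook("rstudio.sessionInit", function(newSession) {
  options(download.file.method = "wget")
}, action = "append")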

Error when running any R code in R markdown

I get the following errors whenever I try to run any piece of code in R Markdown:
Error: no more error handlers available (recursive errors?); invoking 'abort' restart
Error: option error has NULL value
I didn't have any issues using R Markdown last week. The only thing that happened between then and now is that I ran a lot of data analysis on very large data frames (16M points) in another R script, which produced multiple warnings() that I ignored. Does that have anything to do with it? I quit RStudio, tried restarting, and tried clearing garbage with gc(), but nothing works. I can't find much about this error on Google.

RStudio - removing the memory limit

I'm trying to run some modeling (random forest, using caret) in RStudio Server 1.1.423 (with R 3.4.4, running on an Ubuntu 16.04 server), and it comes back with the following error:
Error: protect(): protection stack overflow
This error doesn't come up if I run the same analysis in an interactive R session. I seem to recall that in the past (in RStudio Server running an older version of R) I was able to resolve this error by issuing memory.limit(500000) in an interactive RStudio Server session, but these days this comes back with:
> memory.limit(500000)
[1] Inf
Warning message:
'memory.limit()' is Windows-specific
A solution that works, and that I use routinely, is to run my analysis from a script, as in Rscript --max-ppsize=500000 --vanilla /location/of/the/script.R, but that's not what I want to do here, as in this particular case I need to run the analysis interactively.
I've also tried adding R_MAX_VSIZE=500000 at the end of my ~/.profile, adding rsession-memory-limit-mb=500000 to /etc/rstudio/rserver.conf, putting options(expressions = 5e5) in my ~/.Rprofile, and running options(expressions = 5e5) in an interactive RStudio Server session. No luck so far; the protect() error keeps popping up.
Any ideas as to how to remove this limit in RStudio Server?
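One hedged idea: "protect(): protection stack overflow" is about the pointer protection stack (the thing --max-ppsize raises), not memory as such, and with caret plus random forest it often comes from a very wide model formula. If that is the case here, switching to caret's x/y interface avoids building the huge terms object entirely. A sketch with placeholder names (train_df and target are assumptions, not from the question):

library(caret)

# x/y interface: pass predictors and outcome directly instead of a formula,
# so no giant model-frame/terms object has to be protected.
fit <- train(
  x = train_df[, setdiff(names(train_df), "target")],  # predictor columns
  y = train_df$target,                                 # outcome vector
  method = "rf"
)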

Running R code on linux in parallel on computing cluster

I've recently ported my Windows R code to a Linux installation to run DEoptim on a function. On my Windows system it all worked fine using:
ans <- DEoptim(Calibrate, lower, upper,
               DEoptim.control(trace=TRUE, parallelType=1, parVar=parVarnames3,
                               packages=c("hydromad","maptools","compiler","tcltk","raster")))
where the function 'Calibrate' consists of multiple functions. On the Windows system I simply downloaded the various packages needed into the R library. The option parallelType=1 ran the code across a series of cores.
However, now I want to put this code onto a Linux-based computing cluster. The function 'Calibrate' works fine stand-alone, as does DEoptim if I run the code on one core. However, when I specify parallelType=1, the code fails and returns:
Error in checkForRemoteErrors(lapply(cl, recvResult)) :
7 nodes produced errors; first error: there is no package called ‘raster’
This error is reproduced whatever package I try to load on the workers, even though the
library(raster)
command worked fine and 'raster' is clearly shown as okay when I list all the libraries with:
library()
So my gut feeling is that, even though all the packages and libraries load okay in the master session, the packages element of DEoptim.control is looking in a different place, because I have used a personal library. An example of how the packages were installed is below:
install.packages("/home/antony/R/Pkges/raster_2.4-15.tar.gz", repos=NULL, type="source", lib="/home/antony/R/library")
I also set the library path as below:
.libPaths('/home/antony/R/library')
Does anybody have any idea what I am doing wrong, and how to set the 'packages' option in DEoptim.control so I can run DEoptim across multiple cores in parallel?
Many thanks, Antony
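One hedged possibility: the worker processes started by parallelType=1 are fresh R sessions, and a .libPaths() call made only in the master session does not carry over to them. Putting the personal library into ~/.Renviron makes every new R process, including DEoptim's workers, search it. A sketch using the path from the question:

# Append the personal library to ~/.Renviron so newly spawned R processes
# (such as DEoptim's parallel workers) also search it.
cat("R_LIBS_USER=/home/antony/R/library\n",
    file = file.path(Sys.getenv("HOME"), ".Renviron"),
    append = TRUE)
# Restart R and verify with .libPaths()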

"Cannot open the connection" - HPC in R with snow

I'm attempting to run a parallel job in R using snow. I've been able to run extremely similar jobs with no trouble on older versions of R and snow, but R package dependencies prevent me from reverting.
What happens: my jobs terminate at the parRapply step, i.e., the first time the nodes have to do anything beyond reporting Sys.info(). The error message reads:
Error in checkForRemoteErrors(val) :
3 nodes produced errors; first error: cannot open the connection
Calls: parRapply ... clusterApply -> staticClusterApply -> checkForRemoteErrors
Specs: R 2.14.0, snow 0.3-8, Red Hat Enterprise Linux Client release 5.6. The snow package has been built against the correct version of R.
Details:
The following code appears to execute fine:
cl <- makeCluster(3)                                         # start three workers
clusterEvalQ(cl, library(deSolve, lib.loc = "~/R/library"))  # load deSolve on each worker
clusterCall(cl, function() Sys.info()[c("nodename", "machine")])  # workers report back fine
I'm an end-user, not a system admin, but I'm desperate for suggestions and insights into what could be going wrong.
It turned out that this cryptic error appeared because an input file requested during program execution wasn't actually present. Each node would attempt to load the file and fail, but the failure surfaced only as a "cannot open the connection" message.
What this means is that almost anything can cause a "connection" error. Incredibly annoying!
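A hedged debugging pattern for this class of failure: probe each worker for the inputs it needs before the real computation, so a missing file fails with a specific message instead of the generic connection error. This reuses the cluster cl from the code above; input_data.rds is a placeholder path, not from the question:

input_file <- "input_data.rds"  # hypothetical file the workers need
clusterCall(cl, function(f) {
  if (!file.exists(f))
    stop("file missing on ", Sys.info()[["nodename"]], ": ", f)
  TRUE
}, input_file)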
