I'm using the package RSAP to read SAP data.
RSAP loads an SNC (Secure Network Connection) dynamic library and searches for it via the environment variable SNC_LIB.
Depending on the local user's system, this might be a 32- or 64-bit library.
I'm setting the environment variable within my R script, but RSAP still searches the old path.
I want to avoid setting the environment variable outside of my application, because it's a Shiny app that will be used by many users.
It seems that the environment variable is changed only within the RStudio session, not outside of it.
Initial state of the environment variables in the RStudio console:
> Sys.getenv("SNC_LIB_64")
[1] "C:\\Program Files\\SAP\\FrontEnd\\SecureLogin\\lib\\sapcrypto.dll"
> Sys.getenv("SNC_LIB")
[1] "C:\\Program Files (x86)\\SAP\\FrontEnd\\SecureLogin\\lib\\sapcrypto.dll"
Code:
# check SNC_LIB path from environment variables
# 32 or 64 bit?
# if 64 bit lib path is set, set the default lib path variable
# SNC_LIB to it
lib_path_64 <- Sys.getenv("SNC_LIB_64")
if (lib_path_64 != "") {
Sys.setenv("SNC_LIB" = lib_path_64)
}
After the code is executed in the RStudio debugger:
Browse[2]> Sys.getenv("SNC_LIB")
[1] "C:\\Program Files\\SAP\\FrontEnd\\SecureLogin\\lib\\sapcrypto.dll"
Error thrown by RSAP on loading the library:
[Thr 12160] Wed Jan 03 17:42:57 2018
[Thr 12160] *** ERROR => SncPDLInit()==SNCERR_INIT, Adapter #1 (C:\Program Files (x86)\SAP\FrontEnd\SecureLogin\lib\sapcrypto.dll) not loaded [sncxxdl.c 727]
The old path is still used. When I change the path outside R before starting RStudio, it works.
Question:
Is there another way to set the library path variable SNC_LIB so that it is changed globally rather than only locally, and so that RSAP's dynamic loading works correctly?
An easy way to reproduce it:
Start RStudio
Call Sys.setenv("TEST_VAR" = "good")
Call Sys.getenv("TEST_VAR")
See the expected result: [1] "good"
Close RStudio
Start RStudio again
Call Sys.getenv("TEST_VAR") again
See the 'wrong', unexpected result: [1] ""
Environment variables set in R affect that R process and the processes it launches; they don't persist after R quits.
It's not clear what steps led to your RSAP error, but your "easy way to reproduce" script is behaving as expected.
The only way a Sys.setenv() call in an R session will affect a subsequent library load is if that load happens in the same R session (e.g. loading an R package that loads the library) or in a process R launches (e.g. running a command via system()).
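As a hedged illustration (the path below is made up), the scope of Sys.setenv() can be checked directly from R:
# A variable set with Sys.setenv() is visible inside this R process and in
# child processes started from it, but not in independently started programs.
Sys.setenv(SNC_LIB = "C:/some/path/sapcrypto.dll")  # hypothetical path
Sys.getenv("SNC_LIB")                               # visible in this session
shell("echo %SNC_LIB%")                             # visible to a child shell (Windows only)
# A separately started RStudio/R session, or a program not launched from this
# R process, still sees the old value (or none at all).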
The following frequently fails in RStudio, but I can't figure out exactly when or why:
reticulate::use_virtualenv("myrpyenv")
pa <- reticulate::import("pyarrow")          # this fails
pq <- reticulate::import("pyarrow.parquet")  # this fails
pc <- reticulate::import("pyarrow.compute")  # this fails
ds <- reticulate::import("pyarrow.dataset")  # this fails
adlfs <- reticulate::import("adlfs")$AzureBlobFileSystem  # this works
abfs <- adlfs(connection_string = Sys.getenv("Synblob"))  # this works
The failure is Error in py_module_import(module, convert = convert) : ImportError: DLL load failed while importing lib: The specified procedure could not be found.
Clearing the environment and restarting R is not enough to make it work; I need to spawn a new session. It always works in Rgui (as opposed to RStudio).
I've unchecked Restore .RData into workspace at startup and changed Save workspace to .RData on exit to Never.
Something seems to be hanging around that breaks it, but I can't figure out what.
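Not an answer, just a sketch of how one might narrow down what differs between the two front ends; the guess that an already-loaded DLL causes the conflict is purely an assumption:
# Compare these between RStudio and Rgui before importing pyarrow
names(getLoadedDLLs())    # DLLs already loaded by the front end
Sys.getenv("PATH")        # search path the Python DLLs will be resolved against
reticulate::py_config()   # which Python / virtualenv reticulate actually uses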
I am running Red Hat Enterprise Linux (RHEL) 8.5 with Linux kernel 4.18 and GNOME 3.32.2. On this system, I have R 4.1.2 compiled via the asdf tool with shared libraries enabled. On top of that, I installed RStudio 2021.09.01-372 from an RPM from the official RStudio website.
When I start RStudio, the first line of output after the usual R startup messages is an error:
Error in tools::startDynamicHelp() : internet routines cannot be loaded
I am unable to figure out what's causing this error, and because of it I can't do things like refresh CRAN or update packages. But if I start a plain R session from a terminal (instead of RStudio), this error does not occur.
Some things I tried:
Install the krb5 and libssh2 packages on my host system: Didn't help.
Starting a "pure" R session (both with and without the --vanilla argument) from the Terminal tab within Rstudio also gives this error. If I try to run update.packages() from this session, it pops up a window to select a CRAN mirror then fails with the following:
Warning: failed to download mirrors file (internet routines cannot be loaded); using local file '/home/[my username]/.asdf/installs/R/4.1.2/lib64/R/doc/CRAN_mirrors.csv'
Warning: unable to access index for repository https://cloud.r-project.org/src/contrib:
internet routines cannot be loaded
Warning message:
In download.file(url, destfile = f, quiet = TRUE) :
unable to load shared object '/home/penyuan/.asdf/installs/R/4.1.2/lib64/R/modules//internet.so':
/lib64/libssh.so.4: undefined symbol: EVP_KDF_ctrl, version OPENSSL_1_1_1b
But as I said, the strange thing is that these errors don't happen if I start an R session outside of RStudio.
Within RStudio, the only workaround I can find is to run this command upon startup (suggested in this thread):
options(download.file.method="wget")
Once this is done, everything else seems to work, such as package updates.
However, I don't want to do this manually every time I start RStudio, so I tried putting it into ~/.Rprofile along with a test print() as follows:
print("This is `~/.Rprofile`")
options(download.file.method="wget")
When I open RStudio, I can see the output from the print() call, but the options() setting seems to have no effect, because the original error shows up again. I still have to enter options(download.file.method="wget") manually every time.
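One quick sanity check (just a sketch) would be to confirm, right after startup, whether the option actually survived; that distinguishes 'the .Rprofile was not honoured' from 'the error happens regardless of the option':
# Run immediately after RStudio starts, before anything else:
getOption("download.file.method")
# expected "wget" if ~/.Rprofile was applied, NULL otherwise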
I also tried to fold everything into a .First function in ~/.Rprofile as follows:
.First <- function() {
options(download.file.method="wget")
print("This is the `.First` function in `~/.Rprofile`")
}
Unfortunately, the result is the same as before: print()'s output is shown, but the options() setting has no effect.
I also made sure that my ~/.Rprofile ends with a trailing newline, as discussed here, but this didn't help.
The above are the steps I've tried so far.
Why does this error only occur when running RStudio or a terminal within RStudio? Why doesn't it happen if I start R from a terminal outside of RStudio?
Is there a way to solve the problem so that the error doesn't happen in the first place? If it can't be solved, how do I set up my ~/.Rprofile so that options(download.file.method="wget") will be run?
Thank you.
I wanted to make internally sharing and locally launching a Shiny app developed with the {golem} framework a little more robust.
Hence, I used the renv package and installed the Shiny app as a local package into a project folder.
I proceeded as follows (thanks #Kat for the suggestion):
initialize renv using renv::init(bare = TRUE)
renv::install("my_local_package")
renv::snapshot(type = "all")
renv::isolate()
Write a launch file consisting of:
library(golempackage)
renv::restore()
golempackage::run_app(options = list(launch.browser = TRUE))
Share the folder.
However, when launching the Shiny app on a different computer (or in a Docker testing environment), I get the following error, caused by the package bslib. The same happens when I delete my cache:
An error has occurred!
File attachments must exist: 'C:/Users/XYZ/AppData/Local/R/cache/R/renv/cache/v5/.../bslib/lib/bs3/assets/fonts'
Note: this error occurs even if I make the cache project-local and share it inside the project folder.
In that case the error message references the project-local rather than the global cache, but unfortunately still as an absolute path, which throws an error for other users.
This is all super weird and I have not the slightest idea why this occurs.
I would like to avoid removing bslib.
As far as I can see, the error is coming from the sass package, e.g.
https://github.com/rstudio/sass/blob/f7a954027447dd0b9826ec01c7084c89a6e64fcc/R/layers.R#L442-L443
While I don't know exactly what's going on, you could probably use the R debugger to check why that's failing. (Does the referenced folder exist in both cases? Are you expecting renv to be using the cache in the second case?)
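For instance (a sketch, with the path pieced together from the error message above), you could check on the failing machine whether the referenced assets folder exists at all, and drop into the debugger at the point of failure:
# Where does bslib resolve its bundled assets, and does that folder exist?
p <- system.file("lib", "bs3", "assets", "fonts", package = "bslib")
p
dir.exists(p)
# To inspect the failing call interactively:
options(error = recover)
# or, while the Shiny app is running:
options(shiny.error = recover)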
Context
I'm developing (for the first time) an R package using Rcpp that implements an interface to another program (Maxima). This package defines a C++ class whose constructor needs to retrieve the path to an initialization script that gets installed with the package (the in-package path is inst/extdata/maxima-init.mac). The path to this script is then used as a parameter to spawn a child process that runs the program.
In order to retrieve the path to the installed initialization script I'm calling the R function base::system.file from within the C++ class constructor definition:
...
Environment env("package:base");
Function f = env["system.file"];
fs::path p(Rcpp::as<std::string>(f("extdata", "maxima-init.mac", Named("package") = "rmaxima")));
std::string utilsDir = p.parent_path().string();
...
// spawn child process using the path in utilsDir
My R/zzz.R creates an object of that class when the package gets attached:
loadModule("Maxima", TRUE)
.onAttach <- function(libname, pkgname) {
"package:base" %in% search()
maxima <<- new(RMaxima)
}
The Problem
I can install.packages("rmaxima") and library(rmaxima) just fine, and the package works as expected.
I now want to increase my development efficiency by using devtools::load_all(), to avoid having to run R CMD build rmaxima, install.packages("rmaxima") and library(rmaxima) each time I want to test changes. However, when calling devtools::load_all() (or similarly devtools::test(), with the working directory set to the package root), my implementation freezes, because the variable utilsDir is empty and therefore launching the child process does not return (I guess it keeps waiting for a valid path). I eventually need to kill the process manually. The same thing happens without setting .onAttach().
Apparently devtools::load_all() does not reproduce R's default search path. What can I do? Is this the problem, or am I missing something else?
Update
I just came across the following note in the devtools::load_all() documentation, which could be a hint in the right direction:
Shim files:
‘load_all’ also inserts shim functions into the imports environment of
the loaded package. It presently adds a replacement version of
‘system.file’ which returns different paths from ‘base::system.file’.
This is needed because installed and uninstalled package sources have
different directory structures. Note that this is not a perfect
replacement for base::system.file.
I also realized that devtools::load_all() only installs my package temporarily (into /tmp/ on my machine), but somehow doesn't copy the files from my inst/ directory:
rcst#Velveeta:~$ ls -1R /tmp/RtmpdnvOQg/devtools_install_ee1e82c780/rmaxima/
/tmp/RtmpdnvOQg/devtools_install_ee1e82c780/rmaxima/:
DESCRIPTION
libs
Meta
NAMESPACE
/tmp/RtmpdnvOQg/devtools_install_ee1e82c780/rmaxima/libs:
rmaxima.so
/tmp/RtmpdnvOQg/devtools_install_ee1e82c780/rmaxima/Meta:
features.rds
package.rds
As it turns out, devtools provides a solution to exactly this problem.
In short: calling system.file unqualified (i.e. resolved from the global environment, with the devtools package attached) solves the issue. Specifically, the modification:
// Environment env("package:base");
// Function f = env["system.file"];
Function f("system.file");
fs::path p(Rcpp::as<std::string>(f("extdata", "maxima-init.mac", Named("package") = "rmaxima")));
std::string utilsDir = p.parent_path().string();
Explanation
base::system.file(..., mustWork = FALSE) returns an empty string if no match is found. devtools::load_all() temporarily installs the package inside /tmp/ (on my Linux machine). The directory structure of the temporary installation differs from that of a regular installation, i.e. one created by install.packages(). In my case, most notably, devtools::load_all() does not copy the inst/ directory, which contains the initialization file.
Now, calling base::system.file("maxima-init.mac", package = "rmaxima", mustWork = FALSE) naturally fails, since it searches inside the temporary installation. Having devtools attached masks system.file() with devtools' version, which, as mentioned above, is "... meant to intercept calls to base::system.file()" and behaves differently from it. Practically, I think this means that it searches the package's source directory instead of the temporary installation.
This way, simply calling system.file() from the global environment calls the right function, either from devtools or base, for either the development or user version of the package automatically.
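A small sketch of that resolution (assuming a devtools version that exposes the shim on the search path, as the quoted documentation and the masking behaviour above suggest):
library(devtools)
devtools::load_all(".")   # from the package source directory, as in the question
# Where does the unqualified name resolve to?
find("system.file")
# e.g. "package:devtools" "package:base" (order depends on the search path)
# The shimmed call finds files under the source tree (inst/ included), while
# base::system.file() only sees the temporary installation:
system.file("extdata", "maxima-init.mac", package = "rmaxima")
base::system.file("extdata", "maxima-init.mac", package = "rmaxima")  # ""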
Nonetheless, additionally using ccache (thanks #dirk) substantially speeds up my development workflow.
When running R CMD check I get the following note:
checking for future file timestamps ... NOTE
unable to verify current time
I have seen this discussed here, but I am not sure which files it checks for timestamps, so I'm not sure which files I should look at. This happens both locally on my Windows machine and remotely on different systems (using GitHub Actions).
Take a look at https://svn.r-project.org/R/trunk/src/library/tools/R/check.R
The check command relies on an external web resource:
now <- tryCatch({
foo <- suppressWarnings(readLines("http://worldclockapi.com/api/json/utc/now",
warn = FALSE))
This resource http://worldclockapi.com/ is currently not available.
Hence the following happens (see the same source file):
if (is.na(now)) {
any <- TRUE
noteLog(Log, "unable to verify current time")
See also references:
https://community.rstudio.com/t/r-devel-r-cmd-check-failing-because-of-time-unable-to-verify-current-time/25589
So, unfortunately, this requires either a fix to the check function by the R development team ... or the web resource coming back online.
To add to qasta's answer, you can silence this check by setting the _R_CHECK_SYSTEM_CLOCK_ environment variable to zero, e.g. Sys.setenv('_R_CHECK_SYSTEM_CLOCK_' = 0).
To silence this in a persistent manner, you can set this environment variable on R startup. One way to do so is through the .Renviron file, in the following manner:
install.packages("usethis") (If not installed already)
usethis::edit_r_environ()
Add _R_CHECK_SYSTEM_CLOCK_=0 to the file
Save, close file, restart R
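After restarting, a quick sanity check confirms that the variable was picked up from ~/.Renviron:
Sys.getenv("_R_CHECK_SYSTEM_CLOCK_")
# [1] "0"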