I work in an environment where linking of dynamic libraries is restricted to certain locations. When I use RStudio and request a new C++ file, I get the "Hello World" template. When I try to compile and link that by clicking on "Source" in RStudio, I get an error:
LoadLibrary failure: Access is denied.
This error occurs because the library was placed in a location from which DLL files are not allowed to be loaded. To work around this limitation, I would like to determine how to tell Rcpp to place the temporary DLLs (not in a package) in a specific location.
I know that Dirk has suggested that this is not in the scope of Rcpp and that all code should live in packages, but that will not be the most user-friendly environment for the users here. I suspect that most will use RStudio projects with Git.
So, that being said, is there an environment variable that I can set to get Rcpp to place temporary DLL files in a specific place? Or is there some other mechanism I can use to change this?
Try setting TMPDIR, which R respects. This is indeed not an Rcpp issue but a generic R CMD build / R CMD INSTALL issue.
From help(tempfile):
The environment variables TMPDIR, TMP and TEMP are checked in
turn and the first found which points to a writable directory is
used: if none succeeds /tmp is used.
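For example, a minimal sketch of how this could look; the path C:/approved/rtemp is only a placeholder for a directory your policy allows, and TMPDIR has to be set before R starts, because the session's temporary directory is fixed at startup:

# In ~/.Renviron (or the system environment), not via Sys.setenv() mid-session:
TMPDIR=C:/approved/rtemp

# Then, in a fresh R session:
tempdir()                      # should now point below C:/approved/rtemp
Rcpp::sourceCpp("hello.cpp")   # the scratch DLL should be built under tempdir()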
PS: It is Rcpp, with a lower-case c.
After having set the path for the default working directory, as well as for my first (and only) project, within the RStudio options, I wonder why RStudio keeps creating an empty folder named "R" within my "/home" directory every time it is started.
Is there any file I could delete/edit (or, if need be, create) to stop this annoying behaviour, and if so, where is it located?
System: Linux Mint v. 19.3
Software: RStudio v. 1.3.959 / R version 3.4.4
Thanks in advance for any hints.
Yes, you can prevent the creation of the R directory — R is configurable via a set of environment variables.
However, setting these correctly isn’t trivial. The first issue is that many R packages are sensitive to the R version they’re installed with. If you upgrade R and try to load the existing package, it may break. Therefore, the R package library path should be specific to the R version.
On clusters, an additional issue is that the same library path might be read by various cluster nodes that run on different architectures; this is rare, but it happens. In such cases, compiled R packages might need to be different depending on the architecture.
Consequently, in general the R library path needs to be specific both to the R version and the system architecture.
Next, even if you configure an alternative path, R will silently ignore it if it doesn't exist. So be sure to manually create the directory that you've configured.
Lastly, where to put this configuration? One option would be to put it into the user environment file, the path of which can be specified with the environment variable R_ENVIRON_USER — it defaults to $HOME/.Renviron. This isn’t ideal though, because it means the user can’t temporarily override this setting when calling R: variables in this file override the calling environment.
Instead, I recommend setting this in the user profile (e.g. $HOME/.profile). However, when you use a desktop launcher to launch your RStudio, this file won’t be read, so be sure to edit your *.desktop file accordingly.1
So in sum, add the following to your $HOME/.profile:
export R_LIBS_USER=${XDG_DATA_HOME:-$HOME/.local/share}/R/%p-library/%v
And make sure this directory exists: re-source ~/.profile (launching a new shell inside the current one is not enough), and execute
mkdir -p "$(Rscript -e 'cat(Sys.getenv("R_LIBS_USER"))')"
The above uses the XDG base directory specification, which is the de facto standard on Linux systems.2 The path uses the placeholders %p and %v. R will fill these in with the system platform and the R version (in the form major.minor), respectively.
If you want to use a custom R configuration file (“user profile”) and/or R environment file, I suggest setting their location in the same way, by configuring R_PROFILE_USER and R_ENVIRON_USER (since their default location, once again, is in the user home directory):
export R_PROFILE_USER=${XDG_CONFIG_HOME:-$HOME/.config}/R/rprofile
export R_ENVIRON_USER=${XDG_CONFIG_HOME:-$HOME/.config}/R/renviron
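Once this is in place, a fresh R session should reflect the new locations; a quick check (the output will of course depend on your own values):

Sys.getenv(c("R_LIBS_USER", "R_PROFILE_USER", "R_ENVIRON_USER"))
.libPaths()   # the XDG library directory should be listed first, once it exists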
1 I don’t have a Linux desktop system, but I believe that editing the Exec entry to the following should do it:
Exec=env R_LIBS_USER=${XDG_DATA_HOME:-$HOME/.local/share}/R/%p-library/%v /path/to/rstudio
2 Other systems require different handling. On macOS, the canonical setting for the library location would be $HOME/Library/Application Support/R/library/%v. However, setting environment variables on macOS for GUI applications is frustratingly complicated.
On Windows, the canonical location is %LOCALAPPDATA%/R/library/%v. To set this variable, use [Environment]::SetEnvironmentVariable in PowerShell or, when using cmd.exe, use setx.
I want to optimize my process for building a package. I have some Fortran code (.f90) in pckgname/src:
pckgname/src/FortranFile1.f90
pckgname/src/FortranFile2.f90
I am using RStudio. When I build the package, it creates the src-i386 and src-x64 folders, inside which the .o object files are produced:
pckgname/src-i386/FortranFile1.o
pckgname/src-i386/FortranFile2.o
pckgname/src-x64/FortranFile1.o
pckgname/src-x64/FortranFile2.o
then DLL files are produced in each of these folders from the .o files:
pckgname/src-i386/dllname.dll
pckgname/src-x64/dllname.dll
Thereafter, if I want the check of the code to succeed, I need to manually copy-paste the DLL into these two folders (in the previous version of the question I wrote "code" instead of "dll", which might have led to misunderstandings):
pckgname/inst/libs/x64/dllname.dll
pckgname/libs/x64/dllname.dll
My question is: is it normal that I have to do this, or is there a shorter way that avoids copy-pasting dllname.dll into these two folders by hand? It is indeed a potential source of error.
NB: If I don't copy the DLLs into the said folders, I get the following error messages (translated from French):
Error in inDL(x, as.logical(local), as.logical(now), ...) :
  unable to load shared object 'C:/Users/username/Documents/pckgname/inst/libs/x64/dllname.dll':
  LoadLibrary failure: The specified module could not be found.
Error in inDL(x, as.logical(local), as.logical(now), ...) :
  unable to load shared object 'C:/Users/username/Documents/pckgname/libs/x64/dllname.dll':
  LoadLibrary failure: The specified module could not be found.
[...]
The short answer
Is it normal that I have to do this?
No. If path/to/package is the directory you are developing your package in, and you have everything set up for your package to call your Fortran subroutines correctly (see "The long answer"), you can run
R CMD build path/to/package
at the command prompt, and a tarball will be constructed for you with everything in the right place (note you will need Rtools for this). Then you should be able to run
R CMD check packagename_versionnumber.tar.gz
from the command prompt to check your package, without any problems (stemming from the .dll files being in the wrong place -- you may have other problems, in which case I would suggest asking a new question with the ERROR, WARNING, or NOTE listed in the question).
If you prefer to work just from R, you can even
devtools::check("path/to/package")
without having to run devtools::build() or R CMD build ("devtools::check()... [b]undles the package before checking it" -- Hadley's chapter on checking; see also Karl Broman's chapter on checking).
The long answer
I think your question has to do with three issues potentially:
The difference between the directory structure of packages before and after they're installed. (You may want to read the "What is a package?" section of Hadley's Package structure chapter -- luckily R CMD build at the command prompt or devtools::build() in R will take care of that for you)
Using INSTALL vs. BUILD (from the comments to the original version of this answer)
The proper way to set up a package to call Fortran subroutines.
You may need quite a bit of advice on the process of developing R packages itself. Some good guides include (in increasing order of detail):
Karl Broman's R package primer
Hadley Wickham's R packages
The Writing R Extensions manual
In particular, there are some details about having compiled code in an R package that you may want to be aware of. You may want to first read Hadley's chapter on compiled code (Broman doesn't have one), but then you honestly need to read most of the Writing R Extensions manual, in particular sections 1.1, 1.2, 1.5.4, and 1.6, and all of chapters 5 and 6.
In the meantime, I've set up a GitHub repository here that demonstrates a toy example R package, FortranExample, showing how to correctly set up a package with Fortran code. The steps I took were:
Create the basic package structure using devtools::create("FortranExample").
Eliminate the "Depends" line in the DESCRIPTION, as it set a dependency on R >= 3.5.1, which will throw a warning in the check (I have now also revised the "License" field to eliminate a warning about not specifying a proper license).
Make a src/ directory and add toy Fortran code there (it just doubles a double value).
Use tools::package_native_routine_registration_skeleton("FortranExample") to generate the symbol registration code that I placed in src/init.c (See Writing R Extensions, section 5.4).
Create a nice R wrapper for using .Fortran() to call the Fortran code (placed in R/example_function.R; a sketch of such a wrapper appears after this list).
In that same file use the #' @useDynLib FortranExample Roxygen tag to add useDynLib(FortranExample) to the NAMESPACE file; if you don't use Roxygen, you can put it there manually (see Writing R Extensions 1.5.4 and 5.2).
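For reference, here is a minimal sketch of what such a wrapper might look like; the subroutine name double_it is illustrative, and the actual code in the repository may differ:

# R/example_function.R
#' Double a numeric value via the Fortran subroutine
#' @useDynLib FortranExample
#' @export
example_function <- function(value) {
  # .Fortran() passes arguments by reference and returns them as a list;
  # the Fortran subroutine is assumed to overwrite x with 2 * x
  result <- .Fortran("double_it", x = as.double(value), PACKAGE = "FortranExample")
  result$x
}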
Now we have a package that's properly set up to deal with the Fortran code. On a Windows machine (running Windows 8.1 and R 3.5.1) I have tested both the approach of running
R CMD build FortranExample
R CMD check FortranExample_0.0.0.9000.tar.gz
from the command prompt, and that of running
devtools::check("FortranExample")
from R. There were no errors, and the only warning was the "License" issue mentioned above.
After cleaning up the after-effects of running devtools::check("FortranExample") (for some reason the cleanup option is now deprecated; see below for an R function, inspired by devtools::clean_dll(), that handles this for you), I used
devtools::install("FortranExample")
to successfully install the package and tested its function, getting:
FortranExample::example_function(2.0)
# [1] 4
The cleanup function I mentioned is
clean_source_dirs <- function(path) {
  # Directories where compiled artifacts accumulate during build/check
  paths <- file.path(path, paste0("src", c("", "-i386", "-x64")))
  # Match object files and shared libraries by their extensions
  file_pattern <- "\\.(o|so|dll|a|sl|dylib)$"
  unlink(list.files(path = paths, pattern = file_pattern, full.names = TRUE))
}
No, it is not normal, and there is a solution to this problem: make use of Makevars.win. The reason for your problem is that the DLLs look for their dependencies in places defined by the PATH environment variable and by relative paths fixed during linking. Linking is done when R CMD INSTALL runs, following the MinGW defaults plus any custom parameters defined in the (Windows-specific) file Makevars.win. As soon as the resulting library is copied elsewhere, the binding to the places where the dependent DLLs were located may break. So if you put the DLLs in a place where dependent libraries typically reside, such as, for instance, $(R_HOME)/bin/$(ARCH)/,
cp -f <your library relative path>.dll $(R_HOME)/bin/$(ARCH)/<your library>.dll
then during the check R will look for your dependencies there as well, so they will not be missed. A very crude solution, but it worked in my case.
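As a rough sketch of how that copy step could be wired into src/Makevars.win (the dependency name and source location are placeholders, not from the original answer; recall that recipe lines in a makefile must be indented with a tab):

# src/Makevars.win (sketch)
all: $(SHLIB) copy_deps

# Copy the dependent DLL next to R's own binaries once the package DLL is built
copy_deps: $(SHLIB)
	cp -f ../inst/extlibs/mydep.dll $(R_HOME)/bin/$(ARCH)/mydep.dll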
Is there any documentation on manually installing a package into a user library when the R.home() path is locked down and incomplete (no etc, no bin, just library)? The system does NOT support shelling out to execute R CMD, which I believe standard R does.
I would like to build existing source packages (from CRAN) and install into a user library directory, so that I can use the library() function and get all the usual namespace and *.Rdx and *.Rdb files.
At the moment, I'm plodding through the install.packages, tools::.build_package, and tools:::.install.packages source, using a standard macOS R and the R sources. Hopefully this has been documented in a more user-friendly fashion and my Google searches have missed it.
Thanks.
You don't need to use a different install.packages method; rather, you only need to specify a writable location for storing packages and give it precedence over the system default one. A simple way to accomplish this is to set an R_LIBS environment variable. For instance, in my .bashrc I have
export R_LIBS='/home/username/.local/lib/R-3.3.3'
Then, by default, all packages are installed there. Further, if a package is installed both there and in the system-wide location, the copy there takes priority when loading.
You can verify that the location is being used by checking .libPaths() in your R session.
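For instance, with the variable above in effect (the package name is just an example):

Sys.getenv("R_LIBS")
.libPaths()                      # the writable location should be listed first
install.packages("data.table")   # installs into .libPaths()[1] by default
library(data.table)              # loaded from the user library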
I have read the R FAQs and other posts, but I am a bit confused and would be grateful to know whether I did everything correctly.
On Windows, in order to modify the default library folder, I created a file Renviron.site and put it inside E:/Programs/R-3.3.0/etc.
The file has only one line saying
R_LIBS=E:/Rlibrary
When I open R and run .libPaths() I see E:/Rlibrary as [1] and the default R library E:/Programs/R-3.3.0/library as [2].
This should mean that from now on all packages I install will go in E:/Rlibrary, but at the same time I will be able to load and use both the packages in this folder and those in the default location. Am I correct?
When you load a package via library, R goes through each directory in .libPaths() in turn to find the required package. If the package isn't found in any of them, you will get an error. This means you can have multiple versions of a package (in different directories), but the package that will be used is determined by the order of .libPaths().
Regarding how .libPaths() is constructed, from ?.libPaths:
The library search path is initialized at startup from the
environment variable 'R_LIBS' (which should be a colon-separated
list of directories at which R library trees are rooted) followed
by those in environment variable 'R_LIBS_USER'. Only directories
which exist at the time will be included.
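To illustrate the search order (the package name and the listed directories are examples only):

# The directories are searched in order; the first hit wins
.libPaths()
# Which copy of a package library() would pick up
find.package("ggplot2")
# Every library tree that holds a copy of it
Filter(dir.exists, file.path(.libPaths(), "ggplot2"))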
I have attempted to install R and RStudio on the local drive of my work computer, as opposed to the organization's network folder, because anything that runs through the network is really slow. When installing, the destination path shows that it's my local C: drive. However, when I install a new package, the default path shown is my network drive and there is no option to change it:
.libPaths()
[1] "\\\\The library/path/I/don't/want"
[2] "C:/Program Files/R/R-3.2.1/library"
I'm running Windows 7 Professional. How can I remove library path [1] and make path [2] the primary one for all base packages and all new packages that I install?
Windows 7/10: If your C:\Program Files (or wherever R is installed) is blocked for writing, as mine is, then you'll get frustrated editing Rprofile.site (as I did). As specified in the accepted answer, I updated R_LIBS_USER and it worked. However, even after reading the fine manual several times and extensive searching, it took me several hours to do this. In the spirit of saving someone else time...
Let's assume you want your packages to reside in C:\R\Library:
Create the folder C:\R\Library. Next I need to add this folder to the R_LIBS_USER path:
Click Start --> Control Panel --> User Accounts --> Change my environment variables
The Environment Variables window pops up. If you see R_LIBS_USER, highlight it and click Edit. Otherwise click New. Both actions open a window with fields for Variable and Value.
In my case, R_LIBS_USER was already there, and its Value was a path to my desktop. I added the folder I created to the path, separated by a semicolon: C:\R\Library;C:\Users\Eric.Krantz\Desktop\R stuff\Packages.
(NOTE: In the last step, I could have removed the path to the Desktop location and simply left C:\R\Library).
See help(Startup) and help(.libPaths) as you have several possibilities where this may have gotten set. Among them are
setting R_LIBS_USER
assigning .libPaths() in .Rprofile or Rprofile.site
and more.
In this particular case you need to go backwards and unset wherever \\\\The library/path/I/don't/want is set.
To otherwise ignore it, you need to override it explicitly, i.e. via
library("somePackage", lib.loc=.libPaths()[-1])
when loading a package.
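Along the same lines, a sketch of the second possibility above: the unwanted entry can be dropped once at startup via .Rprofile (this assumes, as in the question, that the network path is the first element of .libPaths()):

# In ~/.Rprofile: rebuild the library search path without the first entry
.libPaths(.libPaths()[-1])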
Facing the very same problem (avoiding the default path on a network drive), I came up with this solution using the hints given in other answers.
The solution is to edit the Rprofile file to override the variable R_LIBS_USER, which by default points to the home directory.
Here the steps:
Create the target destination folder for the libraries, e.g.,
~\target.
Find the Rprofile file. In my case it was at C:\Program Files\R\R-3.3.3\library\base\R\Rprofile.
Edit the file and change the definition of the variable R_LIBS_USER. In my case, I replaced the line file.path(Sys.getenv("R_USER"), "R", with file.path("~/target", "R", (note the forward slash, so that R does not interpret \t as an escape sequence).
The documentation that supports this solution is here.
Original file with:
if(!nzchar(Sys.getenv("R_LIBS_USER")))
    Sys.setenv(R_LIBS_USER=
               file.path(Sys.getenv("R_USER"), "R",
                         "win-library",
                         paste(R.version$major,
                               sub("\\..*$", "", R.version$minor),
                               sep=".")
               ))
Modified file:
if(!nzchar(Sys.getenv("R_LIBS_USER")))
    Sys.setenv(R_LIBS_USER=
               file.path("~/target", "R",
                         "win-library",
                         paste(R.version$major,
                               sub("\\..*$", "", R.version$minor),
                               sep=".")
               ))
Windows 10 on a Network
Having your packages stored on the network drive can slow down the performance of R / RStudio considerably, and you spend a lot of time waiting for the libraries to load/install, due to the bottleneck of having to pull and push data over the network to and from your local host. See the following instructions on how to create an .Rprofile on your local machine:
Create a directory called C:\Users\xxxxxx\Documents\R\3.4 (or whatever R version you are using; this is where you will store your local R packages - your directory location may be different from mine)
In the R console, type Sys.getenv("HOME") to get your home directory (this is where your .Rprofile will be stored and where R will always check for it - and this is on the network if your home directory lives there)
Create a file called .Rprofile and place it in :\YOUR\HOME\DIRECTORY\ON_NETWORK (the directory you get after typing Sys.getenv("HOME") in the R console)
File contents of .Rprofile should be like this:
# Search two places for packages: install new packages to the first directory,
# and load built-in packages from the second (the second path comes from your
# base R installation and will be different for some)
.libPaths(c("C:/Users/xxxxxx/Documents/R/3.4", "C:/Program Files/Microsoft/R Client/R_SERVER/library"))
message("*** Setting libPaths to local hard drive ***")
# Insert a sleep command at line 12 of the unpackPkgZip function, i.e. just after the package is unzipped
trace(utils:::unpackPkgZip, quote(Sys.sleep(2)), at = 12L, print = TRUE)
message("*** Added 2 second delay when installing packages, to accommodate virus scanner for R 3.4 (fixed in R 3.5+) ***")
# Fix problem with tcltk for the sqldf package: https://github.com/ggrothendieck/sqldf#problem-involvling-tcltk
options(gsubfn.engine = "R")
message("*** Successfully loaded .Rprofile ***")
Restart RStudio and verify that the messages above are displayed.
Now you can enjoy faster performance of your application on the local host, versus storing the packages on the network and slowing everything down.
I was struggling for a while with this as my work computer (with Windows 10) created the default user library on a network drive, which would slow down R and RStudio to an unusable state.
In case this helps someone, this is the easiest way I found, without requiring admin rights:
make sure the directory you want to install your packages into exists. If you want to respect the convention, use: C:\Users\username\R\win-library\rversion (for example, something like: C:\Users\janebloggs\R\win-library\3.6)
create a .Renviron file in your home directory (which might be on the network drive?), and in it, write one single line that defines the R_LIBS_USER variable to be your custom path:
R_LIBS_USER=C:\Users\janebloggs\R\win-library\3.6
(feel free to add comments too, with lines starting with #)
If a .Renviron file exists, R will read it at startup and use the variables as they are defined in there, before running the code in the .Rprofile. You can read about it in help(Startup).
Now it should be persistent between sessions!
After a couple of hours of trying to solve the issue in several ways, some of which are described here, the option of creating a Renviron file worked for me (on Windows 10), but a little differently from what was written above.
The task is to change the value of the variable R_LIBS_USER. To do this, two steps are needed:
Create a file named Renviron (without a dot) in the folder \Program\etc\ (where Program is the directory in which R is installed -- for example, for me it was C:\Program Files\R\R-4.0.0\etc)
Insert a line in Renviron with the new path: R_LIBS_USER = "C:/R/Library"
After that, restart R and use .libPaths() to confirm that the default directory has changed.
I think I tried all of the above and it didn't work for me. This worked, though:
In home directory, make a file called ".Renviron"
In that file, write:
.libPaths(new = "/my/path/to/libs")
Save and restart R if you had it open