Now I have two Linux distributions in two different partitions, plus a data partition that is shared between them so they can use common files and folders. I have the same (major) version of R in both distributions. My questions are:
Can I use a common R-package path, so that I only need to install R packages in one distribution and can use them in the other?
What problems might I face in this situation?
Yes, you can.
For example, brew and conda each create a directory containing all the binaries and libraries installed through them.
So consider using one of them. Either way, you can make the binaries visible by adding their directory to the PATH environment variable:
export PATH="my/binary/path:"$PATH
Additionally, you might prefer to edit the .bashrc or .bash_profile of both installations, adding a line that extends PATH.
Personally, I like to create a .bashrc.d/ directory, put configuration files in it (mypath.sh, myalias.sh, myfunctions.sh, ...), and source every file in that directory by adding a loop like this at the bottom of the .bashrc file:
for file in ~/.bashrc.d/*; do
    [ -r "$file" ] && source "$file"
done
This might work for packages that contain only R code. For packages with compiled code I do expect problems (a quick check is sketched after these questions):
Do both Linux distributions use the same linker and compiler?
Do both Linux distributions use the same system libraries?
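To see which of the shared packages those questions apply to, you can look for a libs directory inside each installed package, since that is where compiled code lands. A minimal sketch, assuming the shared library lives at /data/Rlib (a hypothetical path):
## Make the shared library visible in this session (or set R_LIBS_USER
## in ~/.Renviron so it applies automatically in both distributions)
.libPaths(c("/data/Rlib", .libPaths()))
## Packages whose installed tree contains a 'libs' directory ship
## compiled code; these are the ones that may break across distros
pkgs <- list.dirs("/data/Rlib", recursive = FALSE)
basename(pkgs)[dir.exists(file.path(pkgs, "libs"))]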
I am trying to use cl-sql for database access to sqlite3.
But I am getting the error
Couldn't load foreign libraries "libsqlite3", "sqlite3". (searched CLSQL-SYS:*FOREIGN-LIBRARY-SEARCH-PATHS*: (#P"/usr/lib/clsql/" #P"/usr/lib/"))
The same is with sqlite.
I have installed sqlite3 using apt-get and there is a file libsqlite.so.0 in /usr/lib directory.
I also tried to build sqlite3 from source, but I couldn't get the .so file. What am I doing wrong?
Your problem is that cl-sql has a third-party dependency. If you inspect the implementation of cl-sql (probably under "~/quicklisp/dists/quicklisp/software/clsql-202011220-git/db-sqlite3/sqlite3-loader.lisp") you will see that the function database-type-load-foreign is trying to load a library named either "libsqlite3" or "sqlite3".
Depending on your operating system this is either looking for a .dll or .so with exactly one of those names.
Given that libsqlite.so has a different name on your particular distribution of Linux, you have a number of options to make this library work:
Install a version of sqlite3 with the correct binary
Create a symbolic link that redirects to your binary via ln -s /usr/lib/libsqlite.so.0 /usr/lib/libsqlite3.so (assuming libsqlite.so.0 is the library you actually have)
Add new paths to CLSQL-SYS:*FOREIGN-LIBRARY-SEARCH-PATHS* to point to the correct binary if it is installed elsewhere (via clsql:push-library-path)
I'm trying to set up an easy to use R development environment for multiple users. R is installed along with a set of other dev tools on an NFS mount.
I want to create a core set of R packages that also lives on NFS, so n users don't need to install their own copies of the same packages n times. Then I was hoping users could install one-off packages to a local R library. Has anyone worked with an R setup like this before? From the documentation it looks doable, by adding both the core package and personal package file paths to .libPaths().
You want to use the .Renviron file (see ?Startup).
There are three places to put the file:
Site wide in R_HOME/etc/Renviron.site
Local in either the current working directory or the home area
In this file you can specify the R_LIBS and R_LIBS_SITE environment variables.
For your particular problem, you probably want to add the NFS drive location to R_LIBS_SITE in the R_HOME/etc/Renviron.site file.
## To get R_HOME
Sys.getenv("R_HOME")
I use a cluster (OS is Linux) which does not have R. I would like to install R in my personal folders so that I can just do
Rscript example.R arg1 arg2
How should I install R on this cluster knowing that I don't have admin rights?
How can I then manage the packages?
I'm not sure this is on-topic, but: all you really have to do is
download the R source tarball from CRAN; unpack it somewhere in your file space
create an r-build directory at the same level of the hierarchy (not technically necessary, but it's better practice to keep the source and build directories separate)
create an installation directory (say ~/r_install) somewhere sensible within your file space
cd to the source directory and run tools/rsync-recommended (this downloads the sources of the recommended packages)
cd to the build directory
../[srcdir]/configure --prefix=$HOME/r_install (configure wants an absolute path, so $HOME is safer than ~ here)
make (to build the binaries)
make install (to move everything where it belongs; not technically necessary, as you can run R from the build directory)
Where this may get hairy is with all of the system requirements for R (LaTeX, Java, bzip2, etc.). It is theoretically possible to download all this stuff and install it in your own file space, but it gets sufficiently tedious that it will be easier to beg your sysadmin to install at least the dependencies for you ...
as @Hack-R points out, the basics of this answer are already present on Unix & Linux Stack Exchange, although my answer is a little more detailed ...
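On the second question (managing packages), no admin rights are needed either, since packages can live in a personal library. A minimal sketch, with a hypothetical library location:
## Point R at a personal library; setting R_LIBS_USER=~/r_install/library
## in ~/.Renviron makes this happen automatically at startup
lib <- path.expand("~/r_install/library")
dir.create(lib, recursive = TRUE, showWarnings = FALSE)
.libPaths(c(lib, .libPaths()))
## Packages now install to and load from the personal library
install.packages("data.table", repos = "https://cloud.r-project.org")
library(data.table)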
I'm trying to develop an R package that will include some previously compiled executable programs and their supporting libraries. (I know this is bad form, but it is for internal use).
My question: do the special exec and tools directories have any special functionality within R?
The documentation seems to be sparse. Here is what I've figured out so far:
From here
files contained in exec are marked as executable on install
subdirectories in exec are ignored
exec is rarely used (my survey of CRAN says tools is just as rarely used)
tools is around for configuration purposes?
Do these directories offer anything that I couldn't get by creating an inst/programs directory?
[R-exts] has this to say:
Subdirectory exec could contain additional executable scripts the package needs, typically scripts for interpreters such as the shell, Perl, or Tcl. This mechanism is currently used only by a very few packages. NB: only files (and not directories) under exec are installed (and those with names starting with a dot are ignored), and they are all marked as executable (mode 755, moderated by ‘umask’) on POSIX platforms. Note too that this is not suitable for executable programs since some platforms (including Windows) support multiple architectures using the same installed package directory.
It's quite possible the last note won't apply to you if it's only for internal use.
Nevertheless, I'd suggest avoiding abusing any existing convention that might not apply precisely to your situation, and instead use inst/tools or inst/bin.
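If you go with inst/bin, the programs end up under <pkg>/bin/ in the installed package and can be located portably with system.file(). A sketch with hypothetical package and program names:
## inst/bin/mytool is installed to <pkg>/bin/mytool
exe <- system.file("bin", "mytool", package = "mypkg")
out <- system2(exe, args = "--version", stdout = TRUE)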
As far as I can tell, here is the functionality offered by the exec and tools directories.
exec
From R-exts by way of hadley:
Subdirectory exec could contain additional executable scripts the package needs, typically scripts for interpreters such as the shell, Perl, or Tcl. This mechanism is currently used only by a very few packages. NB: only files (and not directories) under exec are installed (and those with names starting with a dot are ignored), and they are all marked as executable (mode 755, moderated by ‘umask’) on POSIX platforms. Note too that this is not suitable for executable programs since some platforms (including Windows) support multiple architectures using the same installed package directory.
exec features I have figured out
On POSIX platforms (*nix, OS X), the files within exec will be marked as executable (see the sketch after this list).
No subdirectories of exec are included in the package, only files in exec root
(Note: exec could contain binary executables, but there is no architecture/platform handling.)
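To illustrate the executable-bit point from the list above, here is a minimal sketch (package and script names are made up):
## exec/cleanup.sh is installed to <pkg>/exec/cleanup.sh with mode 755
script <- system.file("exec", "cleanup.sh", package = "mypkg")
file.access(script, mode = 1)  # 0 means the file is executable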
tools
From R-exts:
Subdirectory tools is the preferred place for auxiliary files needed during configuration, and also for sources need to re-create scripts (e.g. M4 files for autoconf).
tools features I have figured out
tools is to hold files used at package compile time
All files contained are copied recursively into the source *.tar.gz package (including subdirs)
tools is not included in the final, compiled form of the package. All contents are dropped
I have been doing research and I can't quite figure out how to build my R package, which calls C functions, so that it works in both Windows and Linux environments. I am building the package on a Linux machine.
I have two C files, one.c and two.c, which I place in the src directory after using package.skeleton(...). In the NAMESPACE file I use the directive useDynLib(one, two). Is this correct? Or do I need to put the actual function names instead of the file names? Do I need to export the function names?
Do I need to put the .so files in the src directory, or will these be created automatically? I am worried that it then won't work on a Windows machine, which needs a .dll file.
As you can see I'm a little confused, thanks for the help.
One of the standard R manuals is Writing R Extensions. Part of this manual is the section 5 System and foreign language interfaces. This will probably answer the majority of your questions. In regard to the dynamically linked libraries (dll or so), they are built on the fly. You develop your package, including the C code. Once you want to install the library from source (e.g. using R CMD INSTALL spam), or create a binary distribution, the C code will be compiled into the appropriate library file.
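To make that concrete for the NAMESPACE question: useDynLib() takes the name of the shared object that R CMD INSTALL builds from everything in src/ (by default the package name), not the names of the .c files, and the C routines are then called through .Call() or .C(). A hedged sketch, assuming the package is named mypkg and one of the C files defines a .Call routine add_one:
## In NAMESPACE:
##   useDynLib(mypkg, add_one)
##   export(addOne)
## In R/addOne.R -- a thin R wrapper around the registered C routine
addOne <- function(x) {
  .Call(add_one, as.numeric(x))
}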
Faced with similar headaches, I switched to C++ in combination with Rcpp. Rcpp takes care of all the compilation headaches for you:
http://dirk.eddelbuettel.com/code/rcpp.html
There is also an entire vignette on how to build a package using Rcpp:
http://dirk.eddelbuettel.com/code/rcpp/Rcpp-package.pdf
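As a taste of how much boilerplate Rcpp absorbs, here is a minimal interactive sketch (it requires the Rcpp package; the function name and body are made up). For a full package, Rcpp::Rcpp.package.skeleton() generates scaffolding already wired for compiled code:
library(Rcpp)
## cppFunction() compiles, links and binds the C++ function in one step
cppFunction("
  double sumSquares(NumericVector x) {
    double total = 0;
    for (int i = 0; i < x.size(); i++) total += x[i] * x[i];
    return total;
  }
")
sumSquares(c(1, 2, 3))  # 14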