R packrat unbundle not recreating libraries

I am having trouble understanding what I am doing wrong when deploying a Shiny application on Shiny Server, using packrat to manage the libraries.
I create the project test-deploy in RStudio, then initialize packrat with:
packrat::init()
As I work I install packages (dplyr, ggplot2, etc.). These are stored in /test-deploy/packrat/lib/[OS]/[R Version]. All good so far.
Done working, ready to deploy:
packrat::bundle()
This creates a tar file, which I unbundle on the Shiny Server with
packrat::unbundle("/test-deploy/packrat/bundles/test-deploy-2017-07-14.tar.gz", "/srv/shiny-server/")
I go to that app on the Shiny Server and turn packrat on with
packrat::on()
Now I check which packages are installed, other than the base packages, with
ip = as.data.frame(installed.packages()[,c(1,3:4)])  # keep Package, Version, Priority
ip = ip[is.na(ip$Priority),1:2,drop=FALSE]           # drop base/recommended packages
ip
Output
Package Version
packrat packrat 0.4.8-1
The other packages, which I can see in the development environment, are not there. What am I doing wrong?

I forgot to include
packrat::snapshot()
before
packrat::bundle()
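For reference, a minimal sketch of the corrected workflow (the bundle path is the one from the question):

# In the development project, before deploying:
packrat::snapshot()   # record the packages and versions currently in use
packrat::bundle()     # the bundle now includes the snapshot and package sources

# On the Shiny Server:
packrat::unbundle("/test-deploy/packrat/bundles/test-deploy-2017-07-14.tar.gz",
                  "/srv/shiny-server/")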

Related

renv paths - empty libraries when sharing R project which uses renv package

I'm using the 'renv' R package in an RStudio project to control/lock the package versions used by my script. The libraries sit in the project directory under ... renv\library\R-4.1\x86_64-w64-mingw32. I'm using R version 4.1.3 and renv 0.15.5. When this directory is copied to a colleague's machine (using a memory stick), the libraries in the directory mentioned above are blank. I'm assuming these libraries are just pointers to where R saves packages (e.g. "C:/Program Files/R/R-4.1.3/library") and my colleague doesn't have those packages on their machine.
Is there a way to include the packages themselves when sharing the RStudio Project directory?
By default, packages within the renv project directory are symlinked from a global cache location. If you want to ensure packages are instead stored locally in the project library, you can use renv::isolate().
See https://rstudio.github.io/renv/reference/isolate.html for more details.
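A minimal sketch of that approach, run inside the project before copying it:

# Copy the cached packages into the project library so renv/library/ is
# self-contained and survives being copied to another machine.
renv::isolate()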

Air-gapped env- Installing R package source vs binaries

We have an Ubuntu Linux server in our office which is an air-gapped environment. There is no internet access to external networks.
However, I would like to install a few R packages like ggplot2, DatabaseConnector, dplyr, tidyverse, etc. I have more than 10-15 packages to download.
Since I cannot run the usual command install.packages("DatabaseConnector"), I have to download the zipped folders from CRAN as shown here.
I am new to R. So, can you help me with my questions given below?
a) Why are there no files for Linux systems? I only see Windows binaries and macOS binaries. Which one should I download?
b) Should I download binaries or package source? Which one is easier to install?
c) When I download packages as zipped files from CRAN like shown here, will the dependencies be automatically downloaded as well? Or should I look at error messages and keep downloading them one by one?
d) Since I work in an air-gapped environment, what would be the best way to do this process efficiently?
Under Linux, packages are always installed from source. There are no official binary packages for Linux. However, your distro might offer some of them in its official repositories; Ubuntu does. These tend to be quite old versions, though, and are usually limited to a handful of the most important packages. So for Linux you have to download the source packages; the zip files are for Windows and will not work.
You will also need to download all of the dependencies of the packages. For something like tidyverse this is a huge number, and tracking them by hand is a lot of work. The easiest approach is probably to use a package like miniCRAN outside of your air-gapped system to build a selective copy of CRAN. You specify the packages you want and miniCRAN downloads all of their dependencies. You can then copy the downloaded directories to your server, point install.packages() at that location and install as usual. For details see https://andrie.github.io/miniCRAN/articles/miniCRAN-introduction.html.
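A minimal sketch of that approach, assuming the miniCRAN package; the local paths are placeholders:

# On a machine with internet access: resolve the dependency tree and build a
# CRAN-like repository of source packages.
library(miniCRAN)
pkgs <- c("ggplot2", "DatabaseConnector", "dplyr", "tidyverse")
deps <- pkgDep(pkgs, repos = "https://cran.r-project.org", type = "source")
repo_dir <- "/tmp/local-cran"                 # placeholder path on the build machine
dir.create(repo_dir, recursive = TRUE)
makeRepo(deps, path = repo_dir, repos = "https://cran.r-project.org", type = "source")

# After copying the repository folder to the air-gapped server:
install.packages(pkgs, repos = "file:///srv/local-cran", type = "source")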
You might also run into the problem that your system does not have all of the system dependencies needed to build the packages. Under Ubuntu, for example, you need to install libxml2-dev to be able to install the xml package. For that you need to use Ubuntu's package manager; how to do that on an air-gapped system is another issue.

How to import the installed packages for `renv` by default

I want to test the potential for collective maintenance of R scripts across individuals. I am trying to work on an RStudio project together with cloud software (e.g. Dropbox) and version control (e.g. Git), so we can have records of all the updates from different maintainers. Therefore, I am trying out the newly released R package renv.
On my Mac OS, my newly installed packages are available in the 1st directory as I listed below.
.libPaths()
## [1] "/Library/Frameworks/R.framework/Versions/library"
## [2] "/Library/Frameworks/R.framework/Versions/3.6/Resources/library"
However, when I start renv with renv::init(), only the basic packages are available. How can I move these installed packages into the global cache directly, without the need to reinstall them?
You can simply call renv::install() (or renv::restore()) and renv will find packages already installed in the cache. This works because all projects using renv share the global package cache, so the project libraries are composed of symlinks into the global package cache.
In case the renv global package cache and the project library are on different disk volumes, renv will copy the package from the cache into the project library instead of using a symlink.
On macOS, the default location of the renv global package cache is: ~/Library/Application Support/renv.
All the information was extracted from the following link: https://cran.r-project.org/web/packages/renv/vignettes/renv.html.
I hope it helps you. Good luck!
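A minimal sketch of the workflow described above (the package names are just examples):

renv::init()                           # set up the project library
renv::install(c("dplyr", "ggplot2"))   # resolved from the global cache when already present
renv::snapshot()                       # record the versions in renv.lock
.libPaths()[1]                         # project library, populated via links to the cache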

packrat init in an existing project does not initialise

I have a working project on RStudio Server and I want to freeze all packages at their current working versions to avoid future issues due to global library updates.
So I run
> packrat::init()
Initializing packrat project in directory: - "~/R/statistics"
After a while all packages are downloaded and installed with no apparent issues. Then I run a command to get the status:
packrat::status()
and I get:
Error: This project has not yet been packified. Run 'packrat::init()' to init packrat.
I restarted the R session, RStudio and even the server, but to no avail; the same message appeared no matter how often I tried. Maybe this is not the correct procedure to add packrat to an existing project, but the packrat documentation is not very complete.
RStudio Server version: 1.1.456
R version: 3.4.4

How to redirect R to local repository?

I have RStudio connected with rconnect (both are RStudio products or tools) on a Linux OS. When I try to publish Shiny application documents (app.R) from RStudio, the process log says:
Installing required R packages with 'packrat::restore()'....
Curl: (6) Couldn't resolve host 'cran.rstudio.com'
I don't want R to go outside (e.g. to cran.rstudio.com) to find any R packages, since I have the R packages already installed.
Question: What setup or configuration can I perform to tell RStudio or rconnect to look in a specific directory for packages if required?
