I was developing a Shiny app on a Windows machine in RStudio. Now I need to develop it on a Linux machine and will later need to deploy it on a server. Because the app has to run across platforms, I was looking into an environment-management tool so that I don't have to tune and reinstall the needed packages manually.
I tried to use Packrat for this purpose. I made a snapshot on my Windows machine, copied the project to my Ubuntu machine, reopened it in RStudio, installed Packrat, and the restore of the packages ran automatically. Then I ran into an issue: some of the packages that require system dependencies were not installed automatically, e.g. rgdal and jqr. Therefore, I had to install those system dependencies manually in a terminal (it took me a while because about 10 packages required extra system dependencies).
I am wondering if there is an easier way to handle this automatically. Later on, I will need to work with a system administrator to deploy the app to the server. Does Packrat have the capability of automatically installing system dependencies on a Linux machine/server? If anyone has encountered this issue before, or has other, better options, please let me know!
Thank you!
Hello and welcome to StackOverflow.
You are facing a question that is actually much harder to tackle than you may think at first: deployment of complex R package dependencies across different operating systems is a truly hard and, truth be told, unsolved problem!
You can of course use packrat or renv for R package dependencies and snapshots of particular versions. But this does not do anything for system-level dependencies, which are simply taken as "given". So no, you cannot just transfer to another box and say "abracadabra". Sorry!
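For the R-package side, a minimal renv sketch of that snapshot/restore cycle looks like this (assuming renv is installed on both machines; the system libraries still have to exist before the restore can compile anything):

Code:

renv::init()       # on the source machine: set up a project library
renv::snapshot()   # write renv.lock with the exact package versions
# copy the project (including renv.lock) to the target machine, then:
renv::restore()    # reinstall the recorded versions from source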
The closest we have all gotten to fixing this may be Docker, where you can create a portable unit of execution that can be deployed wherever Docker runs: Windows, macOS, different Linux flavours, ... as it encodes everything.
Related
I'm trying to get a shiny app deployed on Shiny Server. I can do that without any issues, but when trying to deploy an app that has a number of dependencies (remote and local) we keep running into issues.
We used renv to track the dependencies (on the Windows dev box) and rebuild them from scratch on the Linux prod box, but even though the dependencies are rebuilt and some get loaded, others do not. The .Rprofile of the user running the app points to the renv activation script.
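For reference, the activation hook renv writes is a single line in the project .Rprofile (assuming a standard renv::init() setup):

Code:

# .Rprofile generated by renv::init(); Shiny Server sources this
# when it starts the app, so the project library gets used.
source("renv/activate.R")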
For the sake of clarity, we need and want all the R code to be built from the source code on the Linux box.
What is the best or standard way (or even a poor way that works) to deploy the libraries for a shiny app to the shiny server? Is renv even the right tool for this scenario or is there a better tool?
I've tried reading the Shiny Server documentation, but the closest it comes is mentioning that it uses the .Rprofile of the user running the app; there doesn't seem to be any sort of guide on the best way to deploy dependent libraries.
This renv documentation discusses some reproducibility caveats:
system dependencies, and
changes in CRAN (e.g. a binary no longer being available).
Since you are moving from a Windows to a Linux system, your packages may have unmet system dependencies (things that need to be installed outside of R) that you didn't encounter on Windows. For example, rJava is required for some of the Excel-related R packages, and getting its related system dependencies installed and working on Linux can sometimes be a challenge. You can use the RStudio Package Manager website to figure out what system dependencies are required for different R packages on your particular Linux OS. Also, the error messages you get when running these apps on Linux should point you in the right direction. These system dependencies are what you'll have to manage yourself, since renv doesn't handle them.
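You can also query those requirements programmatically; a minimal sketch, assuming remotes >= 2.2.0 and network access to the Package Manager API:

Code:

install.packages("remotes")
# Returns the system packages (as install commands, e.g. apt-get install
# libgdal-dev) that an R package needs on the given OS.
remotes::system_requirements("ubuntu", "20.04", package = "rgdal")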
But for a more production-level solution you can try Docker and ShinyProxy. For apps with many dependencies, or especially external dependencies (e.g. Python, SQL, etc.), Docker gives you stronger reproducibility guarantees. ShinyProxy can be used to host apps built into Docker images. This is more work, but you ensure the entire system is reproducible, not just the R version and R packages. ShinyProxy also adds additional hosting capabilities like user authentication.
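The R side of such an image build is small; a minimal sketch, assuming an renv.lock file has been copied into the Docker build context:

Code:

# Run during the image build (e.g. with Rscript) to recreate the
# locked library inside the container.
renv::consent(provided = TRUE)          # avoid the interactive prompt
renv::restore(lockfile = "renv.lock")   # install the exact locked versions

The system-level dependencies still have to be installed in the image (e.g. with the distribution's package manager) before the restore step runs.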
I've just joined a new office and their security is very tight. Essentially, we cannot go online without connecting to another machine. This means any applications that attempt to connect online won't connect to anything.
I'm trying to set up Atom for Python development (I've not used Atom before and it's all that's available to me!), but the lack of internet is causing an issue.
I understand that to install a package, I can download it from GitHub and extract it to ~/.atom/packages, and this works! But what do I do with packages whose dependencies haven't been downloaded? Is there a simple way to get a package and its dependencies while offline?
I've also noticed that although my office has Atom installed, there are no 'apm' or 'npm' commands in the terminal... is this common?
Thanks!
Due to recent experience with several bugs created by updating packages, I wonder what the best approach is for the following problem:
I currently provide a standalone version of my Shiny app, so to speak (just the script files to run it locally), and run a long list of require() calls to load / install the needed packages. However, in the end I would like to use fixed package versions to avoid bugs created by changes in packages.
Is there a way to ensure that the user, who may have older or newer versions of packages on their computer, is using the right version of all the packages my app needs?
You can consider using packrat: https://rstudio.github.io/packrat/.
Unfortunately, private libraries don’t travel well; like all R libraries, their contents are compiled for your specific machine architecture, operating system, and R version. Packrat lets you snapshot the state of your private library, which saves to your project directory whatever information packrat needs to be able to recreate that same private library on another machine.
Short tutorial:
RStudio - File - New Project - New Directory - New Project - "Do: use Path" - Create Project
Enter in the R(Studio) console:
Code:
packrat::init()
.libPaths() # test if libpath has changed
install.packages("reshape2") # installs within one of the packrat libpaths
Installing package into ‘C:/R/packRatTest/packrat/lib/x86_64-w64-mingw32/3.4.3’
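To record what was just installed so that another machine can rebuild it, the usual next step is (sketch):

Code:

packrat::snapshot()   # write packrat/packrat.lock for this project
# copy the project to the other machine, open it there, then:
packrat::restore()    # reinstall the recorded package versions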
The assumption would be that you can use and share RStudio Projects, but I think it would be hard to work without them anyway ;).
Try writing your Shiny app as a package. You can, somewhat, control package versions through the DESCRIPTION file.
Since you said you're using scripts, take a look at: https://github.com/chasemc/electricShine
Even if you don't use it, hopefully looking at the code will help with things like setting the download repo to a specific MRAN date.
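Pinning the repo to a snapshot date is a one-liner; a sketch (the date and package are illustrative):

Code:

# All subsequent installs come from CRAN as it looked on this date.
options(repos = c(CRAN = "https://mran.microsoft.com/snapshot/2019-02-01"))
install.packages("reshape2")   # installs the version current on that date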
I'm trying to use a Linux server with R installed. Apparently the R system library has old versions of non-base packages installed, like dplyr and testthat.
Because I don't have permission to edit the system library, I'm unable to update the packages.
My plan is to use only a user library, so I can control the package versions myself. However, I'm unable to remove the "/usr/lib64/R/library" folder from .libPaths(). I tried pointing the environment variables R_LIBS_SITE and R_LIBS to a different folder via the .Renviron and .Rprofile files, but the /usr/lib64/R/library folder is always present. Removing it with the command .libPaths(.libPaths()[1:2]) doesn't work either.
Is there a way to remove the system library from .libPaths(), so I'm not depending on the update policy of the server admin?
You can't remove the system library, because that's where the base packages live. They can't be installed anywhere else, and R won't work without them.
Best would be for you to get your sysadmin to update the system library. Those obsolete packages probably contain bugs.
If you can't do that, then run update.packages(instlib = "local") to install all the latest versions in the library named "local". (Substitute your own local lib name, of course.) This requires all your users to specify .libPaths("local") when they start, and some will likely forget, so it's not as good.
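A sketch of that setup (the library path is illustrative):

Code:

dir.create("~/R/local", recursive = TRUE, showWarnings = FALSE)
.libPaths(c("~/R/local", .libPaths()))   # search the local library first
update.packages(instlib = "~/R/local", ask = FALSE)   # refresh into it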
It might be easiest for you to just install a full copy of R in your own account. Then you'll have control of things, and anyone using your copy will get your library.
(There's a new release (3.5.3) coming in ten days; you might wait for that, or install one of the betas or RCs, which should be available now, then update again when the final release arrives.)
For me, it works to use
.libPaths(.libPaths()[2:1])
This will still search the system library, but only after it searches my personal library, so if I have a newer version, it uses that. Note: I used .libPaths()[2:1] not .libPaths()[1:2]
I've been getting up to speed with R of late, and am wondering what the most efficient way is to clone an RStudio environment, especially the package installations, from one machine to another. I'd like to be able to switch from my desktop machine to my laptop, but I am adding packages very frequently on the desktop as I work and would like a simple way to make sure the same packages get installed on the laptop.
Any help much appreciated.
PS: not everything I'm installing is from CRAN... some packages are taken from GitHub.
If you have more than a couple of machines to maintain with the same R configuration, I think you should consider setting up your own local R repository.
And I will just redirect you to another SO question here:
Creating a local R package repository
You can also find useful information in the R manual.
Once this is done, you just have to update the local R repository and the packages will be updated on all machines, Windows or Unix.
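A minimal sketch of such a repository (the paths and package file are illustrative):

Code:

# Publish a built package into the local repo and index it:
dir.create("/srv/R/repo/src/contrib", recursive = TRUE)
file.copy("reshape2_1.4.3.tar.gz", "/srv/R/repo/src/contrib")
tools::write_PACKAGES("/srv/R/repo/src/contrib", type = "source")
# On any machine that can reach that path:
install.packages("reshape2", repos = "file:///srv/R/repo")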
You can just copy and paste the folders in the R libraries between machines. As long as it is the same operating system on both machines, there should not be any problem. If you want them to be automatically synchronised, then place the R libraries into something like Dropbox, so that adding or updating a package will automatically appear on either machine with the next sync.
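If copying whole library folders is not an option, you can at least replay the installed-package list; a sketch (CRAN packages only; GitHub ones need remotes::install_github()):

Code:

# On the desktop: record the installed package names
pkgs <- rownames(installed.packages())
saveRDS(pkgs, "pkgs.rds")
# On the laptop: install whatever is missing
pkgs <- readRDS("pkgs.rds")
install.packages(setdiff(pkgs, rownames(installed.packages())))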