When I work on a project in R, there are many packages that may be updated or changed over time. When I finish a project, I want to save all of the packages as they are at that point in time, so that I can reproduce the results of a previous project by "restoring" all of the R packages used to produce them. Is there a way to do this "save" and "restore" of all the R packages locally (without updating them to their most recent versions)? Thank you
One option: you can do an install into a specific directory, then in your R code load the library from that location. For example, to snapshot the forecast library you can use:
install.packages('forecast', lib='~/R/library_1')
followed by
library('forecast', lib.loc='~/R/library_1')
Of course, your code would need access to the library directory if you were to share it.
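A minimal sketch of how a script could then "restore" and use that snapshot, assuming the ~/R/library_1 directory from above travels with the project:
# Put the project's snapshot library first on the search path,
# so packages are picked up from there rather than from the system library.
.libPaths(c('~/R/library_1', .libPaths()))
library(forecast)           # now loaded from the snapshot library
packageVersion('forecast')  # confirm which version was picked up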
Related
I am new to R and just wondering: for every dataset I need to work on, do I need to re-install tidyverse? I noticed that in the Google Data Analytics program we are always asked to run install.packages("tidyverse").
No, you only have to install it once. The program suggests installing it every time to make sure you get it if you don't already have it.
After the first time you install it, it becomes part of your package library, so it is available for scripts to use as long as the package library remains accessible. You can read more about packages and libraries here:
https://hbctraining.github.io/Intro-to-R-flipped/lessons/04_introR_packages.html
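In practice that means something like this (a sketch):
install.packages("tidyverse")  # run once per machine / R installation
library(tidyverse)             # run in every new R session (or script) that uses it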
Due to recent experience with several bugs created by updating packages, I wonder what the best approach is for the following problem:
I currently provide a stand-alone version, so to say, of my Shiny app (just the script files to run it locally) and run a long list of require() calls to load / install the needed packages. However, in the end I would like to use fixed package versions to avoid bugs created by changes in packages.
Is there a way to ensure that the user, who may have older or newer versions of packages on their computer, is using the right version of all the packages my app needs?
You can consider using packrat: https://rstudio.github.io/packrat/.
Unfortunately, private libraries don’t travel well; like all R libraries, their contents are compiled for your specific machine architecture, operating system, and R version. Packrat lets you snapshot the state of your private library, which saves to your project directory whatever information packrat needs to be able to recreate that same private library on another machine.
Short tutorial:
RStudio - File - New Project - New Directory - New Project - (choose a directory name and path) - Create Project
Enter in the R(Studio) console:
Code:
packrat::init()
.libPaths() # test if libpath has changed
install.packages("reshape2") # installs within one of the packrat libpaths
Installing package into ‘C:/R/packRatTest/packrat/lib/x86_64-w64-mingw32/3.4.3’
The assumption would be that you can use and share RStudio Projects, but I think it would be hard to work without them anyway ;).
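Once the project is initialised like this, the usual packrat workflow is roughly the following (a sketch; see the packrat site linked above for details):
packrat::snapshot()   # record the exact package versions in packrat/packrat.lock
packrat::status()     # check that the project library and the snapshot agree
# On another machine, after copying or cloning the project directory:
packrat::restore()    # reinstall the recorded versions into the project library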
Try writing your Shiny app as a package. You can, somewhat, control the package versions it depends on through the DESCRIPTION file.
Since you said you're using script take a look at: https://github.com/chasemc/electricShine
Even if you don't use it, hopefully looking at the code will help with things like setting the download repo to a specific MRAN date.
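For illustration, pinning the download repository to a fixed MRAN snapshot date looks roughly like this (the date is just an example):
# Resolve installs against CRAN as it looked on a fixed date (example date)
options(repos = c(CRAN = "https://mran.microsoft.com/snapshot/2019-06-01"))
install.packages("dplyr")  # installs the version that was current on that date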
I have two different versions of R installed: one which is up to date and which I use for all my regular R coding (it needs to be up to date so that I can use various updated and new packages), and one which I use to access OLAP cubes (it needs to be the R Client from Microsoft, because that is the only one which supports the olapR package, and it currently uses R version 3.4.3).
Since, in theory, I only have to access the OLAP cube once a month, I "outsourced" this task to a separate RStudio project, in which I download and save the required data for all other projects. Hence, the other projects never require the olapR package and can and will be run in the up-to-date R version.
Now, ideally I would like to link an R version to each project, so that I do not have to change my global R version and restart RStudio every time I access the OLAP cube or work on this data-retrieval project (and then switch everything back). However, I could not find any option in RStudio to achieve this.
There are a few threads out there describing the same problem, but with no satisfactory answer in my opinion:
https://support.rstudio.com/hc/en-us/community/posts/200657296-Link-Project-and-R-Version
Rstudio project using different version of R
I also tried looking for a different package than olapR but with similar functionality, but could not find anything except X4R, which seems outdated and does not work for me (https://github.com/overcoil/X4R). Sadly, I am also unable to directly access the databases which the OLAP cube uses for its results, so I cannot go "around" it.
I am happy for any help or suggestions you can offer, whether it is a general workaround to link a project to a specific R version or the (less helpful for the community) solution of accessing the OLAP cube in a different way.
Thanks in advance!
Using the answer from MrGumble I created a .bat file that will execute my .R file using the desired R installation. Even though it is not the answer I thought I would get, I think it is an even better solution to the problem.
For all facing a similar issue, here is the .bat file (I had never created one before, so I also had to google how to do it, and I guess some might be in the same position):
@echo off
title Getting data for further processing in R
echo Retrieving OLAP data
echo.
"C:\Program Files\Microsoft\R Client\R_SERVER\bin\Rscript.exe" "C:\Users\me\Documents\Projects\!Data\script.R"
echo.
echo Saved data
echo.
pause
Thanks again to MrGumble for his help.
Skip RStudio.
RStudio is really just an (albeit powerful and useful) editor, which starts an R console for you (and sets up the surrounding PATH variables, library locations, etc.).
If your monthly task only requires you to run the R-script (or a bit of interactive work), you can simply execute your preferred version of R from the command line and have it run your R script. E.g.
C:\Users\me>"C:\Program Files (x64)\Microsoft R\bin\Rscript" myscript.R
You might have to define some PATH variables so that the older R doesn't look for packages in the newer R's libraries, but that depends entirely on your current setup.
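For example, a few lines at the top of the script run by the older R can show (and, if needed, trim) the library paths it searches; the user-library path below is just a placeholder:
print(.libPaths())  # show which library trees this R installation searches
# If the newer R's user library appears here, drop it (example path; adjust to yours)
.libPaths(setdiff(.libPaths(), "C:/Users/me/Documents/R/win-library/3.6"))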
Imagine that I open two R sessions.
In the first (R1) I loaded the package dplyr.
Now, my question is: is there a way to get the sessionInfo/packages loaded in R1 from within R2?
UPDATE:
I am writing an R help system for the Atom editor. Atom currently does not support function help for R, so I am creating one. To find the help for a function you need to search the packages in which the function is defined, and the best way is to know which packages are loaded in your current R session. That is my difficulty. One way around this is to ignore the loaded packages and search all installed packages instead, but that is too slow if you have a lot of packages installed.
So in my R script I have a line with this code:
pkg <- .packages() # all packages loaded in this currently session
But when I run this script (R1) from another script (R2), it does not get the packages loaded in the current script R2, but those of script R1.
Use the Services API to interact with Hydrogen
The following page details interacting with other packages in Atom: http://flight-manual.atom.io/behind-atom/sections/interacting-with-other-packages-via-services/
Hydrogen is an interface to a Jupyter kernel. It maintains the session with the kernel, and it currently has a plugin API which you could use to get the connection information for the backing kernel: https://nteract.gitbooks.io/hydrogen/docs/PluginAPI.html. Using that you could send your call to .packages().
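Whatever transport you end up using, the R you would evaluate in the backing session is just something like this (a sketch):
.packages()          # packages attached in that session
loadedNamespaces()   # namespaces that are loaded (attached or not)
sessionInfo()        # full session information, including package versions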
There is also r-exec, but I believe that's Mac only. In that case, you could get the
This is a follow-up to How to edit and debug R library sources. I'm wondering if there's an easy way to edit an R library source file and have the edited file loaded by library() without reinstalling the code. I'm asking this in the context of a library that I'm developing, and I am looking for an easy way to incrementally edit and test my code. I know about source() and other ways of loading code into an R session, but I want to test scripts that do the usual library() thing.
Thanks!
It sounds like you're developing a package. If that's the case, then using the devtools package is probably what you want. The load_all() function will reload all of your code so you can make changes and test everything out.
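A typical edit-and-test loop then looks like this (a sketch, assuming your working directory is the package source):
# install.packages("devtools")  # once
library(devtools)
load_all()   # load the package from source, including unexported functions
# ... edit files under R/ ...
load_all()   # reload to pick up the changes, no reinstall needed
test()       # optionally run the package tests (if you use testthat)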