How to load packages automatically when opening a project in RStudio - r

Every time I restart RStudio, it requires me to reload all of the packages that were loaded in the workspace previously. I can't seem to figure out what the problem is; RStudio is saving the projects when it closes them.
How can I make sure that RStudio reloads the necessary packages when I open the project?

I presume you mean that you have to reload all of the packages that were loaded in the workspace previously. That's not an error; that's by design.
If you want to load some packages at startup in a project, you can do so by creating a file called .Rprofile in the project directory and specifying whatever code you want RStudio to run when it loads the project.
For example:
cat("Welcome to this project.\n")
require(ggplot2)
require(zoo)
would print a welcome message in the console, and load ggplot2 and zoo every time you open the project.
See also http://www.rstudio.com/ide/docs/using/projects

In general, default package loading in RStudio is no different from plain R (see "How to load packages in R automatically?"). On startup, R checks for an .Rprofile file in your working directory or, failing that, your home or install directory (on Mac/Linux: ./.Rprofile, or else ~/.Rprofile) and executes it, and hence any options(defaultPackages = ...) call or other package-load-related commands it contains.
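For example, a minimal .Rprofile taking that route (a sketch only, using the same two packages as the earlier example; swap in whatever you actually need) might look like:
local({
  # getOption("defaultPackages") returns the standard set (stats, utils, ...);
  # anything appended here is attached automatically at the end of startup
  pkgs <- getOption("defaultPackages")
  options(defaultPackages = c(pkgs, "ggplot2", "zoo"))
})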
The only small difference is that RStudio "helpfully" changes your default path before startup (see "RStudio: Working with Projects"), so you might load a different, missing, or wrong .Rprofile, depending on whether you've opened an RStudio Project or just plain files, and on what your RStudio default working directory is set to. It's not always clear what directory you're in, so sometimes this causes real grief.
I tend to use RStudio without defining my code as an RStudio Project, simply because it's heavy-handed and creates more files and directories without adding anything (to my use case, anyway).
So the solution I found for maintaining .Rprofile and making sure the right one gets loaded is a trusty old Unix symlink from the project directory to my home directory:
ln -s ~/.Rprofile ./.Rprofile
(If you're on Windows it's more painful.)
You don't need to have one global .Rprofile; you could keep task-specific ones for different types of projects or trees, say a .Rprofile.nlp, .Rprofile.financial, .Rprofile.bio and so on. As well as options(defaultPackages = ...), you can gather all your thematically related settings there: scipen, width, data.table/dplyr-specific options, the search path... (see the sketch below).
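A task-specific file along those lines might contain something like this (just a sketch; the file name and settings are illustrative, so tailor them to the project):
cat("Loading .Rprofile.financial\n")
options(
  defaultPackages = c(getOption("defaultPackages"), "zoo"),
  scipen = 999,   # avoid scientific notation when printing
  width  = 120    # wider console output
)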
Power tips:
obviously keep backups or SCM of your valuable .Rprofile(s). Definitely make sure git is tracking them, so don't put them in .gitignore
if you have multiple .Rprofiles, put a cat("Loading .Rprofile.foo\n") line in each one so you can see from the console that the right .Rprofile.xyz got loaded
after every project, revise, trim, and tweak your .Rprofile: add new use-case stuff, comment out irrelevant stuff, and commit the changes to git

Related

RStudio hangs for a specific project. What file needs to be changed?

I have a project that I've been working on for several months without a problem. Yesterday I tried to profile a bit of code using the raster package that was taking a very long time to run. I left it running overnight and found RStudio unresponsive in the morning. Now when I open that project, I can't do anything except force-quit RStudio. Other projects appear to work fine.
I suspect something bad is stored in the file(s) that remember where I was. Is there one or more files I can delete to regain control of the project? Candidates in the project folder would seem to be one or more items in the .Rproj.user directory.
I found a tip on the RStudio website: in a terminal, navigate to the project directory and rename .Rproj.user to something different. I used this command: mv .Rproj.user .Rproj.user_old. This keeps the user-specific information around in case you want to go back to it.
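If you'd rather do the rename from a plain R console (outside the hung RStudio session) than from a terminal, base R can do the same thing; a sketch, run from the project directory:
file.rename(".Rproj.user", ".Rproj.user_old")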

Change home directory for R

Somehow, my home and library directories in R got changed to a cloud location, which is messing up a bunch of paths, and now, I can't seem to change it back. When I type path.expand("~") in R, I get back "C:/Users/MyName/OneDrive/Documents", but I was expecting to get "C:/Users/MyName/Documents".
When I try .libPaths(), I get "C:/Users/MyName/OneDrive/Documents/R/win-library/3.4" and "C:/Program Files/R/R-3.4.3/library", but I only want the latter.
I have tried uninstalling and reinstalling both R and RStudio (so I am now working with the most recent versions of each), but the cloud path persists. I have seen posts elsewhere on SO about setting things in the .Rprofile file, but I don't think that's the right option, especially since the .Rprofile file would then have to be in a cloud location, which I don't want.
I have looked at my environment variables in the control panel (I'm on Windows 10) and looked at PATH, but there's nothing there that specifies the cloud directory, so I don't know where it's coming from.
How do I permanently change my home directory and also make sure that .libPaths is pointing to only the actual library directory?
This is based on your Windows environment variable HOME. You need to reset HOME to the path that you want: "C:/Users/MyName/Documents".
If you want to do that from within R, you can use:
Sys.setenv(HOME="C:/Users/MyName/Documents")
This change would not be permanent. If you wish to avoid doing this every time you run R, you could put the above statement in your .Rprofile file, as in the sketch below. There is a nice article on setting up your .Rprofile in the RStudio support documentation.
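For example, an .Rprofile along those lines might contain (a sketch only; the paths are the ones from the question, so adjust them to your machine):
Sys.setenv(HOME = "C:/Users/MyName/Documents")
# keep only the system library; drop the OneDrive user library
.libPaths("C:/Program Files/R/R-3.4.3/library")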
Since you mention you are on Windows 10, you can also set the R home directory just for R, without changing your system HOME, with a special environment variable: R_USER. Adding R_USER to your Environment Variables with the path you want for your R home will set the R home path without changing your system HOME. RStudio looks for R_USER first (and then moves on to HOME).
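Whichever route you take, you can check from a fresh R session that the change took effect; for example:
Sys.getenv("R_USER")   # or Sys.getenv("HOME")
path.expand("~")       # should now return "C:/Users/MyName/Documents"
.libPaths()            # should no longer list the OneDrive path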

Commit a large number of files in RStudio using GIT panel

In RStudio, if you are dealing with a directory that contains a large number of files and you want to commit and push the recent changes (that you made to all of them) to your repository, the GUI Git component gets super slow and practically doesn't work. Any ideas?
Of course you can ignore the GUI and stick to command-line Git forever, but if you don't want to, a quick jump to the command line will solve this problem for now.
The temporary solution that I found is as follows:
Click on the blue gear icon in the Git panel inside RStudio.
Select Shell (a terminal window will pop up!)
Write the add and commit command in the terminal:
(ATTENTION: the following command will commit changes to ALL files! You may want to use whatever is appropriate for your situation.)
git add -A && git commit -m 'staging all files'
Now you can go back to the Git GUI and click the Push button. All files that you staged in the terminal window will be pushed up to your repository.
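If you'd rather not leave the R console at all, the same commands can be run through system() instead of opening the shell window (a sketch, assuming git is on your PATH):
system("git add -A")
system('git commit -m "staging all files"')
# then push from the Git panel as before, or: system("git push")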
My workaround today was to:
1. create a man1/ directory in my RStudio project (after force-closing and relaunching RStudio a few times after I had done something that caused it to hang again),
2. include man1/ in .gitignore,
3. move just about everything in man/ to man1/,
4. delete the .git/index.lock file in the repository,
5. futz around with RStudio until it was responsive enough to make the (small) commit of files from man/,
6. pull and push, so that the remote main was once again fully synced,
7. copy some files from man1/ to man/ and commit these,
8. rinse and repeat steps 6 and 7 until there's nothing left in man1/,
9. delete man1/ and its entry in .gitignore.
My recipe above isn't one-size-fits-all... for example, you may have run into a "diff is too large" difficulty with RStudio because of a single oversized file, rather than (as I had) because of "too many" small files. If you're trying to commit a monstrously big set of diffs from a single file, you should be adding that file to your .gitignore rather than expecting git to version-control it without any difficulty. Also, if you're locking up RStudio's Git interface because of "too many" files being committed simultaneously, your first port of call should, I think, be to commit directories one at a time (but do be sure to push and pull after each commit).
And... I'm not going to complain about this defect in how RStudio interacts with git!
Instead I'll close with some kudos. After just a few days of futzing around with RStudio, I'm finding it to be so much easier than what I remember about hacking on S via emacs in the early 1990s. RStudio handles just about everything (especially the Roxygen documentation workflows) far better than the emacs/ESS setup that I had been struggling to get fully operational earlier this month.
I'm also impressed with how R has developed since the last time I looked at it -- about 20 years ago! The semantics lurking in corner cases are still very "surprising", to put it mildly ;-) But I do appreciate how it has maintained compatibility with the truly bizarre and primitive semantics of S while allowing its power users (which will surely never include me!) to write expressively, elegantly, and with an appropriate balance between concision and the write-only alphabet soup of APL and its ilk (https://en.wikipedia.org/wiki/Write-only_language).

R Server - Resuming R Session - message hanging or taking 15+min

I frequently work in an R Server environment. However, whenever I come back to my work following the last working day, the system often gets stuck on 'resuming R session'. This might take upwards of 5-15 minutes. I try to terminate or restart R, but often this doesn't really do anything.
I'm looking for a workaround, as it is very frustrating to go to the R Server URL and have to wait forever to get started again. Ideally, I'd be able to pick up right where I left off. However, if this can't be done, I guess that is OK.
I was looking around at the folder structure and I noticed that there is a folder called "Suspended-R-Session".
Within this folder are a few files such as:
'options',
'lib paths',
'history',
'environment_vars',
'environment',
and 'settings'.
Should I be deleting these files in order to speed up load time?
As described in this link https://support.rstudio.com/hc/en-us/community/posts/200638878-resuming-session-hangup, in my case (R version 3.5) the fix was:
cd ~/.rstudio/sessions/active/session-45204d30
rm -rf suspended-session-data

Running qcollectiongenerator during application compile process

I've been working on a program called RoboJournal for a long time. The next release has full documentation included: whenever the user presses F1 or clicks the Help item in the RoboJournal program, the help file is displayed in Qt Assistant (way classier than simply opening a browser window to some online documentation).
In its base form, the documentation consists of lots of loose HTML and image files included in the source package. These loose files are supposed to be compiled into a QCH compiled help file and QHC collection file during build time so Qt Assistant can display the documentation properly. On Windows, this was fairly easy because I was able to write a batch script to automate the entire build process (including compiling the documentation and moving the output files to the right place).
On Linux, it's a bit more complicated. True, I could write a Bash or Perl script that compiles the documentation along with the rest of the program but I have no guarantee that the people who will eventually create my app's Debian packages from the source package I give them will use the script. The source package is used to create all the Debian packages so everything has to work flawlessly with the standard build procedure (or the source package is worthless). Therefore, I need the compile process to produce the same results whether the user runs the script or not. As it is now, the user has to build and install the documentation manually. Surely there's some way to automate this.
Is it possible to have Qmake add instructions to run qcollectiongenerator to the makefile (in order to build my application's help files) so it gets handled properly during the "make" step? That way, the QHC and QCH files will be ready to install to their proper locations (in my case, /usr/share/doc/robojournal-0.4.1) along with everything else when the user runs "make install". I've considered compiling the QCH and QHC files in advance and providing them in the source package but the whole point of building from source is to be able to re-create the entire app from its base components.
I know I probably have to add additional instructions to my .PRO file but I’m not sure what or how. I've found something that looks promising (http://www.qtcentre.org/archive/index.php/t-49484.html) and gives me hope that it's possible for Qmake to do what I need but I’m not sure how applicable those instructions are to my situation. Do I have to create a PRI file just for this or can I add the instructions directly to the main project file?
You can try to use the QMAKE_POST_LINK variable:
QMAKE_POST_LINK += build_help.sh
P.S.
I have no guarantee that the people who will eventually create my app's Debian packages from the source package I give them will use the script.
I think this is really not your problem :) It's up to them to properly build the package.
