RStudio proxy / no proxy switching

RStudio integration with GitHub works behind my corporate firewall when I set the proxy in the .Renviron file:
http_proxy = http://<proxy>:80
https_proxy = http://<proxy>:80
In my case I don't need to specify a user name and password, which I don't want hard-coded anywhere.
When I work from home I get errors, since RStudio is trying to find the proxy and failing.
Is it possible to write the .Renviron file so that R will:
try using the proxy;
if that works, continue;
if it fails, ignore the proxy settings and continue?

Check first whether you can use the startup package's "Conditional file and directory names" feature:
If the name of a file consists of a <key>=<value> specification, then that file will be included / used only if the specification is fulfilled (on the current system with the current R setup).
For instance, a file ~/.Rprofile.d/os=windows.R will be ignored unless startup::sysinfo()$os == "windows", i.e. the R session is started on a Windows system.
you could use a custom .Rprofile file name to:
unset http_proxy / https_proxy
only if, for instance, the host (node) name is YourHomeLaptop.
That is:
~/.Rprofile.d/nodename=YourHomeLaptop.R
If you are using the same laptop in both places, use:
dirname
with the same project kept in two clones of the same repository.
You can use a different profile that way, with an easy git pull from one repository (for home work) to the other (for office work), and vice versa.
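For illustration, here is a minimal sketch of what that conditional profile file could contain, assuming the startup package is installed and ~/.Rprofile calls startup::startup() so that files under ~/.Rprofile.d/ are processed (the node name is just an example):
# ~/.Rprofile.d/nodename=YourHomeLaptop.R
# Sourced only on the home laptop: drop the corporate proxy settings
Sys.unsetenv(c("http_proxy", "https_proxy"))
On the office machine the file name does not match the node name, so the proxy settings from .Renviron stay in effect.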

Related

Using Renv behind a proxy without password in plaintext

I'm working on R projects behind a proxy server, which is why I use the keyring library to store my proxy credentials and to authenticate against the proxy manually whenever it is required. This way, I don't need to write HTTPS_PROXY=http://usr:pw@proxy:port anywhere in plaintext, neither in global environments nor per project. Of course, at runtime Sys.getenv() does contain this string, but at least only for the session.
So far so good. Now I need to use virtual environments because of some package version mismatches in my projects. For that I ran renv::init(). After closing and reopening the project, RStudio seems to freeze while loading it. I guess renv somehow tries to reach the package sources (some are on CRAN, some on a local GitLab), which cannot work as the proxy is not set.
When I create a .Renviron including the proxy settings with my username and password, everything works fine.
Do you know a way to prevent renv from trying to connect to the package sources at project start? Or do you think the problem lies somewhere else?
My best guess is that renv is trying to make a request to the active package repositories on startup, as part of its attempt to verify that the lockfile + library are in sync. If that's the case, you could disable this via:
RENV_CONFIG_SYNCHRONIZED_CHECK = FALSE
in your .Renviron. See https://rstudio.github.io/renv/reference/config.html for more details.
Alternatively, you could tell renv to load your credentials earlier on in a couple of ways (a minimal sketch follows these suggestions), e.g.
Try adding the initialization work to either your project .Rprofile or (with RENV_CONFIG_USER_PROFILE = TRUE) your user ~/.Rprofile;
Try adding the initialization code to a file located at renv/settings.R, which renv will source relatively early on during load.
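For illustration, a sketch of what that initialization code could look like if placed in renv/settings.R (or the project .Rprofile); the keyring service name, user name, and proxy host are placeholders, not values from the question:
# renv/settings.R -- sourced by renv relatively early during load
# Fetch the proxy password from the system keyring (service and user names are examples)
pw <- keyring::key_get(service = "corp-proxy", username = "myuser")
proxy <- sprintf("http://myuser:%s@proxy.example.com:8080", pw)
Sys.setenv(http_proxy = proxy, https_proxy = proxy)
rm(pw, proxy)
Setting the variables there means they are in place before renv contacts the package repositories, while the password itself never appears in a dotfile.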

set R_USER for multiple users under Windows

Windows seems to put R libraries in a OneDrive directory by default if OneDrive is in use. This is undesirable, especially if you're using both R and OneDrive on multiple computers with the same OneDrive account.
How would I set my library to be put inside C:\Users\<username>\Documents instead of in C:\Users\<username>\OneDrive\Documents? There are good solutions here (How do I change the default library path for R packages), but they're mostly focused on solving this for a single Windows account. Is there a general way to solve it for all accounts?
Every R installation has an etc/ directory with configuration; in it you can set Rprofile.site or, easier still, Renviron.site.
Files ending in .site should not get overwritten on the next install. Make sure you don't bulk-delete them, though.
You can query where it is via R.home("etc"). On my (Linux) system:
> R.home("etc")
[1] "/usr/lib/R/etc"
>
A really excellent solution from here (https://github.com/r-windows/docs/issues/3):
just create an Renviron.site file in the etc/ folder of your R installation, then copy the following line into it:
R_USER=C:/Users/${USERNAME}/Documents
This sets R_USER, which in turn sets R_LIBS_USER according to the user directory of each account under Windows 10.
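As a quick sanity check (not part of the original answer), each user can restart R and confirm where the user library now points; the exact paths depend on the account and R version:
Sys.getenv("R_USER")        # should now be something like "C:/Users/<username>/Documents"
Sys.getenv("R_LIBS_USER")   # if derived from R_USER as described above, it sits under Documents, not OneDrive
.libPaths()                 # the user library shows up here once the directory exists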

Is any extra set up needed to run vscode-r in remote SSH?

I've been using the session watcher feature in vscode-R, and it works great locally. I was wondering what sort of special configuration is needed to get it working in a remote environment?
If I just use VSCode's instructions to connect to a remote host using the Remote Extension, I can get a remote terminal and start radian, but I don't see anything in the task bar indicating the R session it's attached to (unlike the local version, which works just like in the documentation). None of the features (e.g. showing plots, documentation, variable completion, etc.) work.
Do I need any extra set up in the remote machine? Let me know if more information is needed. Thanks!

.Rprofile not sourced when creating RStudio project

In Windows 7, I have my .Rprofile in a custom location (not R_HOME, not HOME). I informed the OS of this location via the user environment variable R_ENVIRON_USER. There is no other .Rprofile anywhere else.
In RStudio, I set the default working directory (when not in a project) to this same location.
When not in a project, the .Rprofile is properly sourced. However, when inside another project or when creating a new one, the .Rprofile is not sourced.
How do I ensure that my .Rprofile is properly sourced even inside projects (assuming there is no project-specific .Rprofile inside the project directory)? I thought the environment variable would take care of that.
Answer & Update
I had to set the environment variable R_PROFILE_USER and provide the full path and filename of the .Rprofile. In a command prompt, I typed:
SETX R_PROFILE_USER "C:\Users\tspeidel\OneDrive\.Rprofile"
You misunderstand what R_ENVIRON_USER is for; it tells R to source an (optional) .Renviron file for the user from the location it provides.
It does not affect what the system thinks your home directory is. That is still governed by HOME, which you set on Windows with the same UI. And you can't just substitute R_HOME for it.
You can, however, read very carefully what R tells you about its startup process in help(Startup). It is, as often, somewhat dense and terse, but it does get to the real meat. In short, I think you may want to use another variable to point to the alternate Rprofile.
None of this has anything to do with RStudio which, after all, just calls R for you (and cannot, as a running process, alter HOME).
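As a quick check (not part of the original answer), once the variable is set you can open any RStudio project and confirm that R sees it and that the profile actually ran:
Sys.getenv("R_PROFILE_USER")   # should print the full path set with SETX above
# If your .Rprofile sets any options or prints a startup message,
# verify that those effects are visible in the new session as well.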

Changing Git protocol for RStudio project already under version control in Windows

I love using RStudio for its built-in integration with version control systems. However, with RStudio on Windows, is there a way to change the Git protocol from http to ssh, or vice versa, for a project already under version control without first having to delete and recreate the project?
I might be missing something, but I originally cloned my repo using http, which I subsequently found to be a massive pain because every time I want to push project changes to GitHub I have to re-enter my username and password. So I removed the project from version control (Project -> Project Options -> Git/SVN -> Version Control System: none) and then tried to re-add version control, hoping to use ssh, but it will only allow you to go back to the original protocol you selected when creating the project in the first place.
The only way I have found to change the protocol is to delete the project and then create a new project from GitHub using the correct ssh parameters. I'd really like to be able to change a project's version control protocol from http to ssh without deleting and re-cloning first.
Is this possible?
Check out git config and Git's configuration mechanism in general. You can configure several remotes, which is how the "distributed" aspect of git works.
You can try just copying the whole repository (or just editing .git/config; keep a copy!) and see what happens in your specific case when you change the configuration. Whether it works depends on lots of things that aren't under git's control, like firewall configurations en route and the configuration on the other end.
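For example (not from the original answer), the remote URL can be switched in place with git remote set-url; the repository path below is a placeholder:
git remote -v                                                 # show the current fetch/push URLs
git remote set-url origin git@github.com:<user>/<repo>.git    # point origin at the SSH URL instead
git remote -v                                                 # confirm the change
RStudio's Git integration should keep working with the same project, since only the remote URL stored in .git/config changes.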

Resources