Determine Path Used by R (RStudio) After file.choose()

Is there a way to determine what file path was used after loading a file into RStudio using file.choose()? I was trying to use an absolute file path to load a file, but RStudio tells me:
Error: `path` does not exist:
I suspect this is occurring because I may have done some damage to my R setup recently (I had issues after installing the latest R, and tried to troubleshoot with online support to correct my .Rprofile and .Renviron files, but I am NOT a programmer). So now R is not recognizing the absolute path to my file, and I'm forced to use file.choose() as a workaround. Ideally, I'd like to see what path file.choose() used so that I can
A) use the correct path reference, and
B) have a better sense of what is going on with my R setup
Any suggestions?
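For reference, file.choose() returns the selected path as a character string, so it can be captured and inspected directly; a minimal sketch (the variable name chosen is just illustrative):

chosen <- file.choose()   # pick the file in the dialog
print(chosen)             # the absolute path that was used
normalizePath(chosen)     # canonical form of the same path
file.exists(chosen)       # should be TRUE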

Related

Can I change the location of Homebrew FFTW install? R can't seem to read FFTW3.h file located in Cellar folder

I'm trying to install wholebrain by Daniel Fürth, following the instructions on the macOS install page (available here). I am running macOS Big Sur 11.5.2, R 4.1.2, and RStudio 2021.09.1.
Unfortunately, the program is not straight-forward to install and requires significant developer tools to work correctly. I'm not a programmer and have almost no experience with coding, so I've been mucking through the instructions for two days now trying to get the install to work correctly and I'm firmly stuck on the final step.
In RStudio, when I run devtools::install_github("tractatus/wholebrain", INSTALL_opts=c("--no-multiarch")), I get the following error message:
/bin/sh: pkg-config: command not found
filter.cpp:9:10: fatal error: 'fftw3.h' file not found
#include "fftw3.h"
         ^~~~~~~~~
1 error generated.
make: *** [filter.o] Error 1
ERROR: compilation failed for package ‘wholebrain’
I have been trying to figure out what this means for quite a while now, and I think I've narrowed it down to R not reading the location of the fftw header file from where it was installed by Homebrew. (I could be totally wrong; again, not a programmer.)
From what I understand, Homebrew always installs under /opt/homebrew/Cellar. And, in fact, in there is the compiled fftw program with the needed "fftw3.h" file. But for some reason, RStudio is not able to find and read the file in that location.
From random googling and reading of other posted issues, I think that RStudio may expect the file to be under /usr/local/include. Can I just copy and paste the header file into that folder? Or will I be screwing something up if I do that? I am totally intimidated by fftw's description of manual compilation, so I don't really want to attempt that. Is there a way to change where R is looking for that header file? I already set my wd to "/", so shouldn't R be able to access any folder on my computer?
I want to post an answer here for anyone who comes after me with the same issue. It came down to RStudio not recognizing the programs Homebrew had installed because it wasn't reading the file location where Homebrew saves them. Homebrew always installs programs in /opt/homebrew/... Here is what I had to do:
In RStudio, open your .Renviron file using this command: usethis::edit_r_environ()
In the file that opens (which for me was totally blank), type PATH=/opt/homebrew/bin:${PATH}, or whatever path you want prepended to the existing PATH.
Quit RStudio and, when prompted, save. Re-open RStudio and run Sys.getenv("PATH") to check (see the sketch below). Your new path (in the example above, /opt/homebrew/bin) should now be prepended to the list of paths that RStudio will use when looking for programs/files. For me this now looks like /opt/homebrew/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/opt/X11/bin:/Library/Apple/usr/bin:/Applications/RStudio.app/Contents/MacOS/postback
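Put together, the whole fix looks roughly like this (a sketch; /opt/homebrew/bin is the Apple-silicon Homebrew default used in the steps above, so adjust it if your Homebrew prefix differs):

# open the user-level .Renviron for editing
usethis::edit_r_environ()

# add this single line in the file that opens, then save and restart RStudio:
# PATH=/opt/homebrew/bin:${PATH}

# after restarting, confirm the Homebrew bin directory is on the search path
Sys.getenv("PATH")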
Finally, I want to say thank you very much to Mark Setchell who really helped point me in the correct direction!

R Markdown: openBinaryFile: does not exist (No such file or directory)

I've developed a Shiny app that allows users to download an HTML report via R Markdown. I'm trying to include custom css and images in my R Markdown file. However, I keep getting this error message:
pandoc: Could not fetch (either css or image file)
openBinaryFile: does not exist (No such file or directory)
When I knit the .Rmd file in RStudio, it is able to reference the image file or css that I want. However, when I run the Shiny app and download the HTML file, I get the above error message. I've even tried to put the images and css files in the same working directory as the .Rmd file, but to no avail...
output:
  html_document:
    css: pandoc.css
(same error message as above)
Been trying to find a solution for this but can't seem to...can anyone help here?
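One way to narrow this down is to check, from the same context that calls rmarkdown::render(), which directory relative paths are resolved against; a minimal diagnostic sketch (assuming pandoc.css and the images are meant to sit next to the .Rmd):

getwd()                                    # directory relative paths are resolved against during render
file.exists("pandoc.css")                  # the css referenced in the YAML header
list.files(pattern = "\\.(css|png|jpg)$")  # resources visible from that directory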
I just had this issue as well, but for me the reason was that the RStudio project was on a shared drive, and I had opened it through the network location. The problem was resolved when I closed out of the project, and opened it back up through a mapped network drive. (If when you run getwd() your location starts with \\, this is probably what is happening to you.)
I had a similar problem. I was not using the full file path. I was using ~/path/to/file. I changed it to the full path (i.e. removed the ~/) and it worked.
I recently ran into this issue on my Windows work computer, where I simply set the .libPaths() in the Rprofile.site file. This is in line with previous answers but a little more detailed.
Step 1
Check your current paths:
> .libPaths()
[1] "\\\\my_work_server.se/some_subdir$/username/Dokument/R/win-library/3.6"
[2] "C:/R/R-3.6.3/library"
Step 2
Look for the path starting with \\\\; in this case it is "\\\\my_work_server.se/some_subdir$/username/Dokument/R/win-library/3.6". This path is most likely an already mounted home directory; in my case it is H: = "\\\\my_work_server.se/some_subdir$/username/". If you don't have a mounted directory, you may want to fix that first or change the library path to another location.
Step 3
So if you've installed R under C:/R/R-3.6.3/ you edit the file C:/R/R-3.6.3/etc/Rprofile.site and add:
.First <- function(){
  .libPaths(c("H:/Dokument/R/win-library/3.6", "C:/R/R-3.6.3/library"))
}
Remember to change H: to where you have your mounted network directory.
Step 4
That's it, restart R and you should be able to knit your document.
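After restarting, a quick check that the override took effect (a sketch; the entries should match whatever you put in the .First call above, provided both directories exist):

.libPaths()
# [1] "H:/Dokument/R/win-library/3.6"  "C:/R/R-3.6.3/library"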
The issue that I had with RStudio & the pandoc error (openBinaryFile error) was due to the file path in which the project was created and loaded.
Project File Path (screenshot not shown)
When I created the project, I had created it using the universal (UNC) network path. When I changed this to the mapped drive letter instead, my pandoc error was gone.
I'm running RStudio 1.2.1335 and R version 3.4.4
I have been having a similar issue with RStudio, rmarkdown and pandoc on a Windows machine with a network filestore. I followed various advice and mapped the drive to a letter, but it still didn't help.
Eventually, I discovered that one of the paths in my .libPaths() contained the network location (the UNC path). I updated that library path to the mapped drive letter and everything seems to be going well!
I had a similar error message, but in my case the problem was that I used a # symbol in one of the chunk titles:
```{r show distribution of # of commits per month}
This is no problem if you just run the notebook in RStudio, but it apparently confuses knitr/pandoc: when knitting, I got an error message like
pandoc: <path>/figure-html/show distribution of : openBinaryFile: does not exist (No such file or directory)
Removing the # from the chunk title solved the problem.
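For reference, the same chunk header with the # removed knits without the error:
```{r show distribution of commits per month}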
I just ran into the same message when running:
pandoc -s foo.html foo.md
where I had totally missed the -o flag; it should have been
pandoc -s -o foo.html foo.md
With the latter, everything is working like a charm.
I believe that I had the same issue. I first tried changing the default directory, but every time I went to knit the R Markdown file I would get the same set of errors indicating that the process was still trying to access files on my network H: drive rather than my local C: drive; specifically, it was looking in the rmarkdown library folder on the network drive. I thought I was following the advice above, but after that did not work I tried deleting the rmarkdown folder on the network drive,
e.g. \\fwnew12\Home\My Documents\R\win-library\3.6\rmarkdown.
This seemed to force R to only use my local C drive (C:/Program Files/R/R-3.6.3/library) and finally successfully knit a PDF. Maybe this is not a recommended approach, but I just need something that works.

Treetagger koRpus package error

I am trying to use the treetag() function in the koRpus package.
The code I have used is
tagged.text <-treetag("C:/Rec_By_Others.txt",treetagger="manual",lang="en",TT.options=list(path="C:\\Program Files\\TreeTagger", preset="en"))
But I keep encountering the following error.
Error in matrix(unlist(strsplit(tagged.text, "\t")), ncol = 3, byrow = TRUE, :
'data' must be of a vector type, was 'NULL'
What do I do?
Your code seems correct to me, but I had the same error message. I could not find any solution for this problem until today. I finally found that I had a problem with my Perl installation, so I reinstalled a new version of Perl. Then I checked that TreeTagger worked properly by following the TreeTagger README instructions, that is:
Installation
Install a Perl interpreter (if you have not already installed one). You can download a Perl interpreter for Windows for free at http://www.activestate.com/activeperl/
Extract the zip file (if it was not extracted yet) and move the TreeTagger directory to the root directory of drive C:.
Download the parameter files for the languages you need, decompress them (e.g. using WinZip or 7-Zip) and move them to the subdirectory TreeTagger/lib. Rename the parameter files to <language>-utf8.par. Example: rename french-par-linux-3.2-utf8.bin to french-utf8.par. Non-UTF-8 parameter files are not supported anymore.
Add the path C:\TreeTagger\bin to the PATH environment variable. The necessary steps differ from one Windows version to the other.
Open a command prompt window and type the command set PATH=C:\TreeTagger\bin;%PATH%
Go to the directory C:\TreeTagger: cd C:\TreeTagger
Now you can test the tagger, e.g. by analyzing the INSTALL.txt file with the command tag-english INSTALL.txt.
Note also that if you install the TreeTagger in a different directory, you have to modify the first path in the batch files tag-*.bat using an editor such as WordPad.
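A quick way to confirm from R that the installation is visible before re-running treetag(); this is only a sketch, and the paths assume the default C:\TreeTagger layout described in the steps above:

# should be TRUE if TreeTagger was moved to the root of drive C: as described
file.exists("C:/TreeTagger/bin/tag-english.bat")

# should be TRUE if C:\TreeTagger\bin was added to the PATH environment variable
grepl("TreeTagger", Sys.getenv("PATH"), fixed = TRUE)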
I hope this helps.

How do I change the default library path for R packages

I have attempted to install R and RStudio on the local drive of my work computer, as opposed to the organization's network folder, because anything that runs through the network is really slow. When installing, the destination path shows that it's my local C: drive. However, when I install a new package, the default path shown is my network drive and there is no option to change it:
.libPaths()
[1] "\\\\The library/path/I/don't/want"
[2] "C:/Program Files/R/R-3.2.1/library"
I'm running Windows 7 Professional. How can I remove library path [1] and make path [2] my primary for all base packages and all new packages that I install?
Windows 7/10: If your C:\Program Files (or wherever R is installed) is blocked for writing, as mine is, then you'll get frustrated editing Rprofile.site (as I did). As specified in the accepted answer, I updated R_LIBS_USER and it worked. However, even after reading the fine manual several times and doing extensive searching, it took me several hours to do this. In the spirit of saving someone else time...
Let's assume you want your packages to reside in C:\R\Library:
Create the folder C:\R\Library. Next I need to add this folder to the R_LIBS_USER path:
Click Start --> Control Panel --> User Accounts --> Change my environment variables
The Environment Variables window pops up. If you see R_LIBS_USER, highlight it and click Edit. Otherwise click New. Both actions open a window with fields for Variable and Value.
In my case, R_LIBS_USER was already there, and the Value was a path to my desktop. I added the folder I created to the path, separated by a semicolon: C:\R\Library;C:\Users\Eric.Krantz\Desktop\R stuff\Packages.
(NOTE: In the last step, I could have removed the path to the Desktop location and simply left C:\R\Library).
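After restarting R, a quick way to confirm the environment variable was picked up (a sketch; note that the C:\R\Library folder must already exist for it to show up in .libPaths()):

Sys.getenv("R_LIBS_USER")   # should now contain C:\R\Library
.libPaths()                 # C:/R/Library should now appear here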
See help(Startup) and help(.libPaths) as you have several possibilities where this may have gotten set. Among them are
setting R_LIBS_USER
assigning .libPaths() in .Rprofile or Rprofile.site
and more.
In this particular case you need to go backwards and unset wherever \\\\The library/path/I/don't/want is set.
To otherwise ignore it, you need to override its use explicitly, i.e. via
library("somePackage", lib.loc=.libPaths()[-1])
when loading a package.
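A sketch of where to look for the unwanted entry, based on the possibilities listed above (these are the standard locations; your setup may differ):

Sys.getenv("R_LIBS_USER")                  # environment variable, e.g. set in .Renviron or in Windows
file.path(path.expand("~"), ".Rprofile")   # user profile that may call .libPaths()
file.path(R.home("etc"), "Rprofile.site")  # site-wide profile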
Facing the very same problem (avoiding the default path on a network drive), I came up with this solution using the hints given in other answers.
The solution is editing the Rprofile file to overwrite the variable R_LIBS_USER, which by default points to the home directory.
Here are the steps:
Create the target destination folder for the libraries, e.g.,
~/target.
Find the Rprofile file. In my case it was at C:\Program Files\R\R-3.3.3\library\base\R\Rprofile.
Edit the file and change the definition of the variable R_LIBS_USER. In my case, I replaced the line file.path(Sys.getenv("R_USER"), "R", with file.path("~/target", "R",.
The documentation that supports this solution is in help(Startup).
Original file with:
if(!nzchar(Sys.getenv("R_LIBS_USER")))
    Sys.setenv(R_LIBS_USER=
        file.path(Sys.getenv("R_USER"), "R",
                  "win-library",
                  paste(R.version$major,
                        sub("\\..*$", "", R.version$minor),
                        sep=".")
        ))
Modified file:
if(!nzchar(Sys.getenv("R_LIBS_USER")))
    Sys.setenv(R_LIBS_USER=
        file.path("~/target", "R",
                  "win-library",
                  paste(R.version$major,
                        sub("\\..*$", "", R.version$minor),
                        sep=".")
        ))
Windows 10 on a Network
Having your packages stored on a network drive can slow down the performance of R / RStudio considerably; you spend a lot of time waiting for libraries to load or install because of the bottleneck of having to retrieve and push data over the server back to your local host. See the following instructions for how to create an .Rprofile on your local machine:
Create a directory called C:\Users\xxxxxx\Documents\R\3.4 (or whatever R version you are using; this is where you will store your local R packages, and your directory location may differ from mine)
In the R console, type Sys.getenv("HOME") to get your home directory (this is where your .Rprofile will be stored and where R will always check for packages; this is on the network if your packages are stored there)
Create a file called .Rprofile and place it in :\YOUR\HOME\DIRECTORY\ON_NETWORK (the directory you get after typing Sys.getenv("HOME") in R Console)
File contents of .Rprofile should be like this:
# search 2 places for packages: install new packages to the first directory and
# load built-in packages from the second (the second comes from your base R
# install and will be different for some setups)
.libPaths(c("C:/Users/xxxxxx/Documents/R/3.4", "C:/Program Files/Microsoft/R Client/R_SERVER/library"))
message("*** Setting libPath to local hard drive ***")

# insert a sleep command at line 12 of the unpackPkgZip function, i.e. just
# after the package is unzipped
trace(utils:::unpackPkgZip, quote(Sys.sleep(2)), at = 12L, print = TRUE)
message("*** Add 2 second delay when installing packages, to accommodate virus scanner for R 3.4 (fixed in R 3.5+) ***")

# fix problem with tcltk for the sqldf package:
# https://github.com/ggrothendieck/sqldf#problem-involvling-tcltk
options(gsubfn.engine = "R")

message("*** Successfully loaded .Rprofile ***")
Restart RStudio and verify that the messages above are displayed.
Now you can enjoy faster performance of your application on local host, vs. storing the packages on the network and slowing everything down.
I was struggling for a while with this as my work computer (with Windows 10) created the default user library on a network drive, which would slow down R and RStudio to an unusable state.
In case this helps someone, this is the easiest way I found, without requiring admin rights:
make sure the directory you want to install your packages into exists. If you want to respect the convention, use: C:\Users\username\R\win-library\rversion (for example, something like: C:\Users\janebloggs\R\win-library\3.6)
create a .Renviron file in your home directory (which might be on the network drive?), and in it, write one single line that defines the R_LIBS_USER variable to be your custom path:
R_LIBS_USER=C:\Users\janebloggs\R\win-library\3.6
(feel free to add comments too, with lines starting with #)
If a .Renviron file exists, R will read it at startup and use the variables as they are defined in there, before running the code in the .Rprofile. You can read about it in help(Startup).
Now it should be persistent between sessions!
After a couple of hours of trying to solve the issue in several ways, some of which are described here, for me (on Windows 10) the option of creating a Renviron file worked, but a little differently from what was written above.
The task is to change the value of the variable R_LIBS_USER. To do this, two steps are needed:
Create the file named Renviron (without dot) in the folder \Program\etc\ (Program is the directory where R is installed--for example, for me it was C:\Program Files\R\R-4.0.0\etc)
Insert a line in Renviron with new path: R_LIBS_USER = "C:/R/Library"
After that, reboot R and use .libPaths() to confirm the default directory changed.
I think I tried all of the above and it didn't work for me. This worked, though:
In your home directory, make a file called ".Rprofile" (R code such as the .libPaths() call below goes in .Rprofile; .Renviron only holds NAME=value environment variables)
In that file, write:
.libPaths(new = "/my/path/to/libs")
Save and restart R if you had it open

Issues with changing my default library path in R

I am having an issue changing my default library in R. I previously had a storage space problem on my university laptop and IT amended my user drives, which left my previous R library in limbo somewhere on the previous drive.
I have uninstalled and re-installed R a few times just to be sure, but it remembers my previous library location when I re-install it, and subsequently R struggles to see the packages I need to run my models.
I have subsequently tried (unsuccessfully) to implement the steps in this previous post: Where does R store packages?
The main problem I encounter is that when I attempt to edit the Rprofile.site file using Vim, it says E12: Rprofile.site: Can't open file for writing. I have also tried the same edit in Notepad++, which doesn't work either. I am no computer programmer, so perhaps there is a step I am missing here?
What I really want is one repository for my library. I would be happy to simply remove [1] below as this is the now defunct drive.
My current library paths are:
.libPaths()
[1] "\\studenthome.qut.edu.au/group05$/n2559005/Documents/R/win-library/3.1"
[2] "C:/Program Files/R/R-3.1.2/library"
The issue you're running into is that Windows has special permissions for subdirectories of C:/Program Files/. You may be able to edit the site profile by opening it in R using the 'Open Script' option in the File menu.
Incidentally, you can implement the same solution by creating a .Rprofile file here:
path.expand('~/.Rprofile')
and placing your call to .libPaths( "/my/favorite/directory" ) in that file. In addition, you can define a function like
.First <- function(){
  if( interactive() ){
    cat("\nWelcome", Sys.info()['login'], "at", date(), "\n")
    if( 'fortunes' %in% utils::installed.packages()[,1] )
      print(fortunes::fortune())
  }
}
in your .Rprofile file, and if you get your fortune at startup, you know that the correct file was sourced at startup. See ?Startup (specifically the third paragraph) for details.
