I'm at a loss as to how to install the RBloomberg package. The only source for the files seems to be GitHub. The offered zip file is called blpwrapper-master.zip, which contains an rbloomberg folder. When I try to install the zip file (in RStudio), I get an error message that it cannot open a compressed file. I rezipped just the rbloomberg folder, but that led nowhere either. How does one go about this?
In general, things can be installed from GitHub using the devtools package. For example:
library("devtools")
install_github("username/packagename")
I don't know who authors RBloomberg, but you can swap the appropriate GitHub username into the call above.
Note: Sometimes this won't work because a developer uses a non-standard repository directory structure, but it should work in most cases. Indeed, that is the case here (as #rawr points out), where you need to use a slightly different path to point to the package (which lives in a subdirectory of the git repo):
install_github("johnlaing/blpwrapper/rbloomberg")
Related
I work in a corporate environment that uses Microsoft Windows Defender Application Control (WDAC) to provide security. This blocks unsigned EXE and DLL files from being installed on devices, so R packages that use DLLs fail to install. The workaround is to provide an R installation from an approved central source, which also copies a default set of packages (tidyverse, data.table, etc.) into the R library. Users can continue to install additional packages that are pure R, but run into issues if they try to install, build from source, or update packages that contain DLL files.
Is there a way to check whether a package uses DLL files in advance of installation?
Output something like:
check_dll(foo)
result: "This package and its dependencies have no DLL files. You can install this package"
check_dll(bar)
result: "bar does not have any DLL files, but one dependency, OOF, uses DLL files.
You have already have a version of OOF installed so it should be safe to install bar"
check_dll(foobar)
result: "foobar has a DLL. Do not attempt to install foobar".
check_dll(RABOOF)
result: "RABOOF does not have any DLL files, but one of it's dependencies,
foobar, does have a DLL file. Do not attempt to install RABOOF".
tools::package_dependencies() will list the package dependencies, but nothing else.
Downloading the zip file from CRAN and inspecting it for a libs/x64 folder with contents would work, but seems like a heavyweight approach. Theoretically, if a package has lots of dependencies, this could result in downloading a lot of files unnecessarily.
Look for the NeedsCompilation field in the DESCRIPTION file. If it is "yes", there will be a DLL. If it is "no", there probably won't be. (If it is not there, the package wasn't built properly, so all bets are off.)
The test is not perfect, because packages can put DLLs into the inst folder to get them installed without compiling them, though CRAN isn't supposed to allow that: "Source packages may not contain any form of binary executable code." Packages like pak (mentioned in the comments) may be allowed to get around this rule, e.g. by downloading binaries, so the check can still miss cases. You will also need to put together a blacklist of packages that will fail your WDAC tests even though they claim not to need compilation, containing pak and others like it.
The NeedsCompilation field is included as a column of the result of available.packages(), so it is very easy to access without trying to install the package.
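As a minimal sketch of that check (the check_dll helper name comes from the question; the example package is illustrative), you could combine available.packages() with tools::package_dependencies():
# Hypothetical helper: report NeedsCompilation for a package and its recursive dependencies.
check_dll <- function(pkg) {
  ap <- available.packages()
  deps <- tools::package_dependencies(pkg, db = ap, recursive = TRUE)[[1]]
  pkgs <- intersect(c(pkg, deps), rownames(ap))  # base/recommended packages may be absent from the index
  ap[pkgs, "NeedsCompilation"]
}
check_dll("data.table")  # returns a named vector of "yes"/"no" values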
I have accepted the answer from user2554330 as the best solution. It makes use of the normal set of commands used for package management; and the matrix generated by available.packages() can be passed to tools::package_dependencies(), removing the need for multiple internet queries.
For completeness I am documenting another possible solution. A script could query the unofficial CRAN Github mirror https://docs.r-hub.io/#cranatgh and look for a /src directory in each package project.
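A hypothetical sketch of that mirror query, assuming the httr package and the read-only mirror organisation at github.com/cran (the helper name and error handling are illustrative; unauthenticated GitHub API calls are rate-limited):
# Hypothetical check: does the CRAN GitHub mirror show a src/ directory for this package?
has_src_dir <- function(pkg) {
  url <- sprintf("https://api.github.com/repos/cran/%s/contents/src", pkg)
  httr::status_code(httr::GET(url)) == 200  # 200 means src/ exists, 404 means it does not
}
has_src_dir("data.table")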
My package is hosted on GitHub, and users can install it through devtools::install_github.
Now I'm using pkgdown to generate the documentation site, which created a 10 MB docs folder. I then found that devtools::install_github always downloads the whole master zipball, which has become quite slow.
I tried to exclude the docs folder with these attempts:
.Rbuildignore: it turned out this only applies to the bundled package, while install_github installs the source package, so it doesn't work.
Putting the package in a pkg folder and moving the generated docs folder out of pkg. However, the whole master zipball is still downloaded, even with subdir = "pkg" specified.
Putting development in a branch and creating a special package branch without the docs folder, merging the two branches but having the package branch exclude docs. I tried making .gitignore branch-specific, but that doesn't seem to work, so this appeared to be impossible.
My newest attempt is to create a separate repo solely for the website and let pkgdown create the website in that folder, like build_site(path = "../docsite/docs"). This should solve the problem and is simple and clean. The only imperfection is that the website URL will not follow the usual pattern.
EDIT: with the latest version of pkgdown there is no path parameter anymore; you need to specify the destination in the site configuration yaml, which works better (you don't need to specify it in every command).
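A minimal sketch of that configuration, assuming a recent pkgdown (the destination value shown is illustrative):
# In _pkgdown.yml, set the output directory instead of passing a path argument, e.g.
#   destination: ../docsite/docs
# then build the site as usual:
pkgdown::build_site()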
I have a computer behind a very restrictive proxy server. It only allows me to surf the web and download programs; it does not allow programs like the Atom text editor to download their packages.
My question is how do I install them using only browser based downloads?
It is certainly possible:
Find the package you want to install, for example the activate-power-mode package.
Click on the Repo button to go to the GitHub repository.
Click Releases towards the top of the UI, then click on the most recent release, 0.4.1 in this case.
Download the source code release in either Zip or GZip depending on your platform.
Extract the content of the archive to a known permanent location, I have chosen:
C:\Source\Atom
Run the following command from your terminal / command prompt (make sure to include quotes around the path):
apm link "C:\Source\Atom\activate-power-mode-0.4.1"
Restart Atom, or reload it with Ctrl-Alt-R, and the package will now be installed.
You can alternatively extract the package directly to your ~/.atom/packages folder; however, you will have to rename the folder to exactly match the name of the package. Additionally, uninstalling the package from Atom will delete the files, which could be annoying if the deletion is accidental.
Because of package dependencies, the safest bet is this:
Install package normally on connected computer
Copy contents from your ~/.atom/packages
Paste contents to ~/.atom/packages on offline computer
Restart Atom
At least this worked for me like a charm.
The answer by Richard Slater is informative, and the answer by Andriy Buday may look less polished, but in my case Andriy Buday's answer was also very important.
I tried to install two packages, atom-beautify and prettier-atom, by following Richard Slater's answer and had problems with some modules not being found. I was not the only one with these problems; consider checking the following links.
The issue of "cannot find module event-kit"
https://github.com/Glavin001/atom-beautify/issues/1734
https://github.com/Glavin001/atom-beautify/issues/1366#issuecomment-269716306
When I decompressed a file (atom-beautify-0.30.3.tar.gz) I received from the GitHub repository, I found directories like appveyor, docs, and examples, but no directory named node_modules, which had been present when I installed the atom-beautify package online using the Atom editor.
To check whether the absence of the node_modules directory was the only problem, I went through the following steps.
Start Atom Editor.
Install atom-beautify online using the Atom editor, as Andriy Buday's answer suggests.
Close Atom Editor.
Move the atom-beautify directory from ~/.atom/packages (which was %HOMEDIRECTORY%%HOMEPATH%\.atom\packages in my case, because I used cmd on Windows 10) to somewhere else.
Decompress atom-beautify-0.30.3.tar.gz and move or copy the resulting atom-beautify-0.30.3 directory into %HOMEDIRECTORY%%HOMEPATH%\.atom\packages, as Andriy Buday's answer suggests.
Rename the directory %HOMEDIRECTORY%%HOMEPATH%\.atom\packages\atom-beautify-0.30.3 to %HOMEDIRECTORY%%HOMEPATH%\.atom\packages\atom-beautify, as Richard Slater's answer suggests.
Move or copy the node_modules directory from the directory moved at step 4 into %HOMEDIRECTORY%%HOMEPATH%\.atom\packages\atom-beautify.
Start Atom Editor.
I found that no error message appeared and that the atom-beautify package worked properly, so I concluded that the absence of the node_modules directory was the only problem with the atom-beautify-0.30.3.tar.gz file I received from the GitHub repository.
I am not sure whether it is normal for the node_modules directory to be missing from the atom-beautify-0.30.3.tar.gz downloaded from the GitHub repository because of some rule I do not know yet, such as placing directories like node_modules somewhere else. If there really is such a rule and somebody explains it in an answer or a comment here, I will appreciate it a lot.
I am not sure whether it is the same for all other packages, but I found it was the same at least for the prettier-atom package.
I hope this helps somebody.
+++++++++++++++++++++++++++
I found why the directory node_modules was not contained in atom-beautify-0.30.3.tar.gz.
I checked the answers at the following link.
How can I manually download packages for atom editor and install them (manually)?
The answer by D3181 included a link to a page on http://discuss.atom.io/ (where I found a helpful answer by Alchiadus) and suggested running apm install in the package's directory. If you need to use a file downloaded from the GitHub repository, such as atom-beautify-0.30.3.tar.gz, you must run apm install in the package's directory before copying or moving it into %HOMEDIRECTORY%%HOMEPATH%\.atom\packages (~/.atom/packages on *nix) of the offline computer.
Decompress the file downloaded from the GitHub repository, e.g. atom-beautify-0.30.3.tar.gz.
Go into the decompressed directory, e.g. atom-beautify-0.30.3.
Run apm install on an online computer. (If the directory containing apm.cmd is not in PATH, run {directory of apm.cmd}\apm.cmd install.)
Rename the directory (e.g. atom-beautify-0.30.3) to the exact name of the package (e.g. atom-beautify).
Move the renamed directory into %HOMEDIRECTORY%%HOMEPATH%\.atom\packages on the offline computer.
Run the Atom editor on the offline computer and check whether the package works properly.
It seems normal that the node_modules directory is not included in the compressed file downloaded from the GitHub repository.
I have been using git for a while but only recently started using packrat. I would like my repository to be self-contained, but at the same time I do not want to include CRAN packages, since they are readily available. It seems that once R is opened in a project with packrat, it will try to use packages from the project library; if they are not available, it will try to install them from src in the project library; and if those are not available, it will look at the libraries installed on that computer. If a library is not available on the computer, will it then look at CRAN?
What files should I include in my git repo as a minimum (e.g., packrat.lock)?
You can choose to set up an external CRAN-like repository with the source tarballs of the packages and versions that you'd like available for your project. The default behaviour, however, is to look to CRAN next, as you've identified in your question. Check out the packrat.lock file: you will see that for each package you use with packrat there is a field Source: CRAN (if you've downloaded the package from CRAN, that is).
When you have a locally stored package source file, the lockfile contents for said package change to the following:
Package: FooPackage
Source: source
Version: 0.4-4
Hash: 44foo4036fb68e9foo9027048d28
SourcePath: /Users/MyName/Documents/code/myrepo/RNetica
I'm a bit unclear on your final question ("What files should I include in my git repo as a minimum (e.g., packrat.lock)?"), but I'm going to take it as a combination of a) which files need to be present for packrat to run, and b) which of those files should be committed to the git repo. To answer the first question, I'll illustrate by initialising packrat on an existing R project.
When you run packrat::init(), two important things happen (among others):
1. All the packrat scaffolding, including source tarballs etc., is created under PackageName/packrat/.
2. packrat/lib*/ is added to your .gitignore file.
So from this, we can see that anything under packrat/lib*/ doesn't need to be committed to your git-repo. This leaves the following 3 files to be committed:
packrat/init.R
packrat/packrat.lock
packrat/packrat.opts
packrat.lock is needed for collaborating with others through a version control system; it helps keep your private libraries in sync. packrat.opts allows you to specify different project specific options for packrat. The file is automatically generated using get_opts and set_opts. Committing this file to the git-repo will ensure that any options you specify are maintained for all collaborators. A final file to be committed to the repo is .Rprofile. This file tells R to use the private package library (when R is started from the project directory).
Depending on your needs, you can choose to commit the source tarballs to the repository, or not. If you don't want them available in your git repo, you simply add packrat/src/ to the .gitignore. But this will mean that anyone accessing the git repo will not have access to the package source code, and the files will be downloaded from CRAN, or from wherever the Source line in the packrat.lock file dictates.
From your question, it sounds like committing the packrat/src/ folder contents to your repo might be what you need.
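As a hedged sketch of the workflow described above (get_opts and set_opts are mentioned in the answer; the vcs.ignore.src option is one documented way to keep packrat/src/ out of the repo, but check your packrat version):
packrat::init()                            # creates the packrat/ scaffolding and updates .gitignore
packrat::get_opts()                        # inspect the project options stored in packrat.opts
packrat::set_opts(vcs.ignore.src = TRUE)   # also ignore packrat/src/ in version control, if preferred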
I am using R 2.13.0 with Windows 7, after giving my user full privileges to the R folder (as described here).
This allows me to install new packages just fine.
However, when using update.packages(), to update existing packages, I keep getting the following error (for example, when updating the MASS package):
package 'MASS' successfully unpacked and MD5 sums checked
Warning: unable to move temporary installation
  'C:\Program Files\R\R-2.13.0\library\file6cae3bcf\MASS'
  to 'C:\Program Files\R\R-2.13.0\library\MASS'
Any suggestions on how to fix this?
P.S.: Running R as an administrator or shifting the library location out of Program Files is not a solution (it's a hack); I am looking for a proper solution.
I found that the problem indeed is the antivirus's "real time file system protection". I did the following to fix the problem:
trace(utils:::unpackPkgZip, edit=TRUE)
I edit line 140 (line 142 in R 3.4.4):
Sys.sleep(0.5)
to:
Sys.sleep(2)
It seems like the antivirus stalls the creation of the package's tmp dir. After changing the sleep to 2 seconds, the error is gone.
EDIT: to do this programmatically execute
trace(utils:::unpackPkgZip, quote(Sys.sleep(2)), at = which(grepl("Sys.sleep", body(utils:::unpackPkgZip), fixed = TRUE)))
(credits #DavidArenburg)
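Once the install succeeds, the trace can be removed again with base R's untrace (not part of the original answer):
untrace(utils:::unpackPkgZip)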
Just to update everyone, I (think that I) found the source of the problem: the antivirus software.
The "real time file system protection" was blocking R from copying the files between folders once they were downloaded.
Upon adding the R directory to the exception list (coupled with adding user permissions and installing R on D:\R), the problem went away. With all of this work, I might as well switch to Linux (I should, really...)
(I updated my post with the above information: http://www.r-statistics.com/2011/04/how-to-upgrade-r-on-windows-7/)
I hope it will help someone in the future,
Tal
If you cannot turn off your antivirus, due to corporate policy for example, here is a workaround that I found. Debugging the package-unzip function and then stepping through it gives the antivirus enough time to do its job without interfering. Use these commands:
debug(utils:::unpackPkgZip)
install.packages("packageName")
and then step through the code (by pressing enter many times) when R starts debugging during the installation.
I found this solution here.
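When the installation is finished, debugging can be switched off again with base R's undebug (not part of the linked solution):
undebug(utils:::unpackPkgZip)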
If you can, just download the binary straight from CRAN. On Windows it will be a zip file. Now manually unzip it into the library/ folder of your R installation (see .libPaths()). It worked for me for some packages.
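A hedged sketch of that manual step (the zip path and package version are hypothetical; your download location will differ):
# Unzip a CRAN binary zip straight into the first library reported by .libPaths():
unzip("C:/Users/me/Downloads/MASS_7.3-60.zip", exdir = .libPaths()[1])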
I had this problem installing both swirl and dplyr. I am working on Windows 64-bit.
Warning: unable to move temporary installation
What I did was access my temporary files on the C: drive, open my file extractor program, and extract the files from the temp folder on the C: drive to my R program files on the C: drive by manually copying them. THIS WORKED FOR BOTH dplyr and swirl. Stoked!
Cheers,
Peach
Can you not use the lib.loc parameter to only update packages in your personal library (in your user directory)?
There should be no way for a normal, non-elevated user to change files in the Program Files folder, so the only thing you can do (if you don't want to elevate the user's privileges) is to have R not update packages there.
A workaround is to avoid installing R in the program files folder (which may be more or less of a hack than just shifting the library location out of it, depending on your point of view).
Finally, if lib.loc doesn't cut it, you can look at the source code for update.packages and create your own customized version that will always avoid the common library location in program files.
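A hedged example of the lib.loc suggestion above, assuming a personal library is configured via R_LIBS_USER:
# Only update packages installed in the per-user library, leaving Program Files alone:
update.packages(lib.loc = Sys.getenv("R_LIBS_USER"), ask = FALSE)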
I just ran into the same problem, and the solution I found was to install packages using the original R software rather than RStudio (also, choose the right mirror site; some of them are blocked). At first I used RStudio to install packages and got the same problem you did. Hope this is helpful.
I have run into this error several times. In my case it is because our admins want us to use remote virtual disks (on Windows 7) for our files, and everything is locked up tight as a drum. The only way I can use R packages is in a lib directory on that remote virtual disk. This wouldn't be a problem except that the network isn't always smooth and fast. Thus, when I need a package, especially one with several other packages in tow (e.g., MBESS), I either have to go through the get.packages() process multiple times until it finally finishes, or make it IT's headache to do quick like the bunny for me. I can't always wait for IT.
I just went to the library folder (Windows XP) and deleted all the fileXXXX folders. Reran the install and it worked.
I had the same problem. Since the issue seems to be the antivirus blocking the transfer of a downloaded file, I tried a different download method in install.packages and it worked.
For example:
install.packages("stringr", method = "curl")
You must go into the properties of the R folder and change the security parameters. You can enable the option to write and modify for all users.
The error : "unable to move temporary installation" is basically arising due to any of the antivirus running on your system.
Try unzipping the downloaded file from the Temp folder into the default library path (you can get it by running .libPaths() in R session).
I'm using MRAN and was having many versioning issues. I was trying to work with tidyverse and ggplot2, and upgrading to the latest version from Microsoft solved all of my RStudio versioning issues.
Version info:
Microsoft R Open 3.5.1
The enhanced R distribution from Microsoft
Default CRAN mirror snapshot taken on 2018-08-01.
Download Microsoft R Open 3.5.1