Say I use the install_git function from the devtools library to install a particular branch of a package, e.g. from the install_git documentation:
install_git("git://github.com/hadley/stringr.git", branch = "stringr-0.2")
Is there a way later on to find out whether a branch was installed and, if so, which one? I can use packageVersion() to find the version of the package installed, but this does not give me any information about which branch was referenced.
I have found the answer: the information is contained in the output of the function packageDescription('package_name'), e.g.
packageDescription('mlr')
The output is fairly verbose, but the following items within that output hold the answer:
RemoteRepo: mlr
RemoteRef: mindepth_order
GithubRepo: mlr
GithubRef: mindepth_order
This shows that the installed mlr package references the branch mindepth_order.
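For example, the relevant fields can also be pulled out directly rather than scanning the whole description (a small sketch; for a CRAN install these remote fields are simply absent and return NULL):
desc <- packageDescription('mlr')
desc$RemoteRef   # branch, tag, or commit the package was installed from, e.g. "mindepth_order"
desc$RemoteRepo  # repository name, e.g. "mlr"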
I am trying to use the downscaleR package to downscale future precipitation data, and I notice that the downscaleR package provides different methods to achieve this.
https://github.com/SantanderMetGroup/downscaleR
When I try to follow the delta method example, I get the error: could not find function "quickDiagnostics".
https://github.com/SantanderMetGroup/downscaleR/wiki/calibration-and-cross-validation
In addition, I have followed the example "Perfect Prognosis Approach: Application to Seasonal Forecasts" available at http://meteo.unican.es/work/downscaler/wiki/html/PPapplycationSeasonalForecast.html. However, the "loadMultiField" and "plotMeanField" functions are not available.
I have loaded the packages downscaleR, transformeR, visualizeR, loadeR, climate4R.UDG, and climate4R.datasets. I wonder whether the above functions have been excluded from the package, or whether I need to install another package. Thanks for any help.
Looking at the GitHub commit history:
remove function quickDiagnostics and dependencies, committed on May 25, 2020
https://github.com/SantanderMetGroup/downscaleR/commit/f85ad1533d3f8579d6072e35b880b82de6c0408e
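If the removed function is still needed, one option is to install an older state of the repository, i.e. a release, branch, or commit from before that change (a sketch using remotes; "<old-ref>" is only a placeholder, not a real tag):
library(remotes)
# replace "<old-ref>" with a tag, branch, or commit hash that predates the removal
install_github("SantanderMetGroup/downscaleR", ref = "<old-ref>")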
Title should be pretty clear, I hope. I'm writing a package called forecasting, with imports for dplyr among other packages. With the imports written into the DESCRIPTION file, I am able to force these other packages to be installed along with forecasting. Is there an equivalent way to do this for the loading of the package? In other words, is there a way that when I load my package with library(forecasting), it automatically also loads dplyr and the other packages?
Thanks
Yes.
Re-read "Writing R Extensions". The Depends: forces both the initial installation as well as the loading of the depended-upon packages.
But these days you want Imports: along with importFrom() in the NAMESPACE file, which is more fine-grained.
But first things first: get it working with Depends.
Edit:
Oops, you're correct; the documentation I referenced is not a primary source. Perhaps this is better:
From the R documentation:
The ‘Depends’ field gives a comma-separated list of package names which this package depends on. Those packages will be attached before the current package when library or require is called.
and
The ‘Imports’ field lists packages whose namespaces are imported from (as specified in the NAMESPACE file) but which do not need to be attached. Namespaces accessed by the ‘::’ and ‘:::’ operators must be listed here, or in ‘Suggests’ or ‘Enhances’.
Original:
From the R packages documentation:
Adding a package dependency here [the DESCRIPTION file] ensures that it’ll be installed. However, it does not mean that it will be attached along with your package (i.e., library(x)). The best practice is to explicitly refer to external functions using the syntax package::function(). This makes it very easy to identify which functions live outside of your package. This is especially useful when you read your code in the future.
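As a concrete sketch (package and function names are illustrative), the two approaches differ in what library(forecasting) does for the user. With Depends, dplyr is attached to the user's search path; with Imports, dplyr is only guaranteed to be installed and loadable, and you import what you need in NAMESPACE:
In DESCRIPTION (Depends attaches dplyr whenever library(forecasting) is called):
Depends: dplyr
In DESCRIPTION (Imports only guarantees dplyr is installed and loadable):
Imports: dplyr
In NAMESPACE (with Imports, pull in just the functions you need):
importFrom(dplyr, mutate, filter)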
Suppose I'm currently developing a package called mypackage. As time goes by, many different functions have landed in there, and I want to reorganize it. So I'd like to create a new package called newpackage in which I would move some of the functions of mypackage (and include new ones later).
The problem is that I don't want original users of mypackage to get object not found errors when they want to use one of the moved functions.
So, I thought about doing the following:
create newpackage and move the functions
add into mypackage DESCRIPTION file : Depends: newpackage
That way, when people install, upgrade, or load mypackage, newpackage would be installed or loaded too, and all the functions would be available.
Do you think it would work, or are there some problems I haven't thought about?
Thanks!
Isn't it generally recommended not to remove functions from a package without first marking them as deprecated? So you could proceed as you planned, but before removing the functions from mypackage, first mark them there as deprecated, and only remove them for good in a later version of the package. During the migration phase, the deprecated stubs in mypackage can simply refer to the functions in newpackage via its namespace, as you planned.
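A minimal sketch of such a deprecation stub (the function name my_fun is hypothetical, and this assumes newpackage is listed in mypackage's Depends or Imports): the old entry point stays in mypackage and forwards to its new home.
# kept in mypackage during the migration phase
my_fun <- function(...) {
  .Deprecated("newpackage::my_fun")  # warn users that the function has moved
  newpackage::my_fun(...)            # forward the call to newpackage
}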
My R package depends upon another package (for example, fields).
What is the best practice to ensure that this package is loaded when my package is loaded?
Should I write a cover R program to do this? Can or should such dependencies be distributed with my distribution?
I would appreciate a detailed answer with scripts.
Edit:
As per the suggestion below, I added the following to the DESCRIPTION file:
Depends: R (>= 1.8.0), fields
Still, the fields package is not loaded automatically when I load my package.
This is something you specify in the DESCRIPTION file that you ship with your package. You can use either the 'Depends' field or, better, the 'Imports' field in combination with a NAMESPACE file. Have a look at the DESCRIPTION and NAMESPACE files from some other packages, or read over the Writing R Extensions manual.
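One hedged note (the package name below is illustrative): DESCRIPTION is only read when the package is built and installed, so after adding the Depends line you need to re-install your package; the copy that is already installed still carries the old DESCRIPTION. You can then verify the result:
# after re-installing the package with the updated DESCRIPTION
library(mypackage)              # should now also attach fields via Depends
"package:fields" %in% search()  # TRUE if fields ended up on the search path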
I'm writing a vignette for one of my packages.
In this vignette, I would like to demonstrate how this package can interact with other packages that are not being imported by the NAMESPACE or by the Imports section of the DESCRIPTION file.
So, I'm putting a require call to use these external packages in my vignette, but of course I get the following NOTE when I run R CMD check on the package:
* checking for unstated dependencies in vignettes ... NOTE
‘library’ or ‘require’ call not declared from: ‘RColorBrewer’
Is there any way around this, or should I either import these external packages or "fake" the vignette using eval=FALSE?
Put it in Suggests: of your DESCRIPTION file.
From p. 6 of the R extensions manual:
The ‘Suggests’ field uses the same syntax as ‘Depends’ and lists packages that are not necessarily needed. This includes packages used only in examples, tests or vignettes (see Section 1.4 [Writing package vignettes], page 26), and packages loaded in the body of functions. E.g., suppose an example from package foo uses a dataset from package bar. Then it is not necessary to have bar use foo unless one wants to execute all the examples/tests/vignettes: it is useful to have bar, but not necessary. Version requirements can be specified, and will be used by R CMD check.
In addition if the vignette properly depends on that package, there should be a
% \VignetteDepends{...}
statement in the vignette itself: Sweave, Part II: Package Vignettes, R News 3/2 (Oct. 2003), 21 - 24.
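For example, a sketch of such a header near the top of the .Rnw file (the package name is taken from the NOTE above; the index entry title is just a placeholder):
% \VignetteIndexEntry{Using RColorBrewer palettes}
% \VignetteDepends{RColorBrewer}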
However, your case possibly is a bit different:
I use if (require("pkgxy")) without % \VignetteDepends{pkgxy} (Suggests: pkgxy in the DESCRIPTION is needed anyway) for some things I want to show but where I don't want to force the user to have all the suggested packages installed. I put a box at the beginning of the vignette where I report which of those packages are available, and if a package is not available when the vignette is built, a "pkgxy is needed to do this" text is put into the vignette.
The "introduction" vignette of package hyperSpec is an example (to find out how it actually works, you need not only the .Rnw but also some more definitions).