Load library into existing environment (equivalent to "local" parameter of "source")?

I am sourcing util functions in production into an environment to encapsulate (and group) the helper functions:
Helper file:
# File: Helper.R
hello <- function() {
  print("Hello world")
}
Client:
helper <- new.env()
source("Helper.R", local=helper)
helper$hello() # call the helper function
How can I migrate my sourced "Helper.R" into a library without breaking the calls of the sourced functions?
What I want is something like
helper <- new.env()
library(Helper, local=helper)
helper$hello() # call the helper function (loaded from the library now)
Is there a way to do this?

You can use the box::use() function in the ‘box’ package.
Then the following attaches a package locally:
box::use(pkg[...])
Alternatively, and potentially closer to what you actually want to do, you could use the following simpler code:
box::use(pkg)
Now the package is not attached at all, and its exported objects can be accessed via pkg$obj. This is somewhat similar to base R’s loadNamespace function, but does considerably more behind the scenes.
Finally, consider not putting your helper code into a package at all, but rather distributing it as a module. This is after all what the ‘box’ package was designed for: instead of creating a package, just distribute your helper.r code file (or folder) and then use it as follows:
box::use(./helper)
See the package website and vignette for a detailed description.
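For the concrete Helper.R from the question, a minimal sketch of the module route could look like this (assuming the ‘box’ package is installed and the helper file sits next to the calling script; box marks module exports with the roxygen-style @export tag):
# helper.r -- the helper rewritten as a 'box' module
#' @export
hello <- function() {
  print("Hello world")
}

# client script -- box resolves ./helper relative to the calling script
box::use(./helper)
helper$hello()   # the module's exports live in the 'helper' environment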

Adding to the list of suggestions, you can also consider using the modules package on CRAN (note: I am the author). When you have your Helper.R file with:
hello <- function() {
  print("Hello world")
}
you can use
helper <- modules::use("Helper.R")
helper$hello()
to encapsulate your helper functions in their own environment. The package also provides some functions to manipulate the local namespace of a module (imports/exports).
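If you want explicit control over what a module exposes, a minimal sketch of an inline module might look like this (names are illustrative; see the ‘modules’ documentation for the exact semantics):
m <- modules::module({
  export("hello")                        # only 'hello' is visible to users of the module
  greeting <- function() "Hello world"   # internal helper, stays private
  hello <- function() print(greeting())
})
m$hello()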

I have found another package called import (similar to "modules") that allows importing packages into an environment:
https://github.com/smbache/import
This package is also on CRAN:
install.packages("import")
It allows importing selected functions, all exported ("public") functions, or all functions (even unexported ones) from a package.
Example:
import::from(dplyr, arrange, .into = "datatools")
The import::from function is a convenient wrapper around getExportedValue.
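Conceptually, the call above is close to the following base-R sequence (a hedged sketch; the environment name and imported function are just the ones from the example):
fn  <- getExportedValue("dplyr", "arrange")   # fetch the exported function
env <- attach(NULL, name = "datatools")       # create an empty entry on the search path
assign("arrange", fn, envir = env)
head(arrange(mtcars, cyl))                    # now resolved via "datatools" on the search path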

Another way could be:
# getNamespace returns the environment representing the name space name. The namespace is loaded if necessary.
# Internal function to support reflection on namespace objects!
env <- getNamespace("data.table")
cars <- env$as.data.table(mtcars)
This example makes all objects of the package data.table available via the environment variable env.
Note: getNamespace is documented as an internal function, so there is some risk (hard to quantify) that its behaviour changes in a future R release.

Related

R doesn't recognise my function's updates within another function

I am working on a package and currently it has a lot of functions. In order to load them every time I open up RStudio, I use these lines of code from devtools:
library(devtools)
suppressMessages(load_all("~/Codes/package1/"))
It works fine, but the problem is whenever I change a function that has been used in another function, R doesn't recognize the changes.
For Example if I have:
func1 <- function() {
  print("version1")
}
func2 <- function() {
  func1()
}
If I then change func1 to print("version2"), rerun it, and then run func2, it still prints "version1" for me.
Does anyone know what the problem is and how I can solve it?
The devtools load_all function simulates loading a package. All functions from a package are stored in a package namespace. Functions remember what namespace they come from via their environment().
Any code you run in the console runs in the global environment. So when you run
func1 <- function() {print("version2")}
you are creating a new function called func1 in your global environment but the func1 from the package namespace is still there. You've created a "shadow" function that masks the original function.
When you go to run func2, which is still in the package namespace, it sees a call to a function named func1. When it looks for this function, it looks first in its own namespace due to R's lexical scoping rules. It finds the original func1, not the one you created in the global environment, and runs that.
Packages generally aren't meant to have their functions swapped or altered after they are loaded. You would have to save the source and call load_all again to reload that folder as a package with the new changes. If you aren't really trying to simulate a package, importing functions with source() does not create a new namespace and is therefore easier to edit after import.
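The lookup rule can be illustrated without a package, using a plain environment as a stand-in for the package namespace (a toy sketch, not devtools-specific):
ns <- new.env()                         # stands in for the package namespace
ns$func1 <- function() print("version1")
ns$func2 <- function() func1()
environment(ns$func2) <- ns             # func2 "remembers" its namespace
func1 <- function() print("version2")   # shadow in the global environment
ns$func2()                              # still prints "version1"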

Self-written R package does not find its own function

I created a package with some functions which are helpful at my company. Recently, I restructured the package so that there are helper functions which need not be accessible to everyone, but are called internally from other (exported) functions of the package. These helper functions are not exported to the namespace (no #' @export in the respective .R files).
Now, when I call one of the "major" (exported) functions, I get the error message (no real function names):
Error in major_function() : could not find function "helper_function"
I'm fairly new to building packages, but from what I have understood so far (from https://cran.r-project.org/web/packages/roxygen2/vignettes/namespace.html), it should neither be necessary to export the helper functions, nor to add #' @importFrom my_package helper_function to the .R file of the major function.
When I tried this, it actually produced errors when checking the package. I also tried to call the helper functions with my_package:::helper_function, but this led to the note that it should almost never be necessary to call functions from the same package like this.
Maybe useful information:
The error occurs only when I call a major_function_1 which internally calls major_function_2 which calls a helper_function.
I think there is more to your problem than what you state. As long as all your functions are defined in the same namespace (this also means that all your functions need to live in .R files in the same folder), the calling function should find the helper-functions accordingly.
I suspect you have your helper functions nested in some way, and that is causing the problem.
I recommend to recheck your namespace structure, or post a simplistic outline of your package here.
Another reason that comes to mind is that you do not export your major_function_2 in the NAMESPACE file in your package root (maybe you have not regenerated the roxygen2 documentation that produces this file), and additionally have a local shadow of the calling function major_function_1. Check this and rerun from a clean build.
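For reference, a minimal layout that works without exporting the helper (file names and function bodies are illustrative):
# R/major.R
#' @export
major_function_1 <- function(x) {
  major_function_2(x)    # found in the package namespace
}
#' @export
major_function_2 <- function(x) {
  helper_function(x)     # also found in the namespace; no export or importFrom needed
}

# R/helper.R
helper_function <- function(x) {
  x * 2                  # internal helper, deliberately not exported
}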

R with roxygen2: How to use a single function from another package?

I'm creating an R package that will use a single function from plyr. According to this roxygen2 vignette:
If you are using just a few functions from another package, the
recommended option is to note the package name in the Imports: field
of the DESCRIPTION file and call the function(s) explicitly using ::,
e.g., pkg::fun().
That sounds good. I'm using plyr::ldply() - the full call with :: - so I list plyr in Imports: in my DESCRIPTION file. However, when I use devtools::check() I get this:
* checking dependencies in R code ... NOTE
All declared Imports should be used:
‘plyr’
All declared Imports should be used.
Why do I get this note?
I am able to avoid the note by adding @importFrom plyr ldply in the file that is using plyr, but then I end up having ldply in my package namespace, which I do not want and should not need, as I write plyr::ldply() the single time I use the function.
Any pointers would be appreciated!
(This question might be relevant.)
If ldply() is important for your package's functionality, then you do want it in your package namespace. That is the point of namespace imports. Functions that you need, should be in the package namespace because this is where R will look first for the definition of functions, before then traversing the base namespace and the attached packages. It means that no matter what other packages are loaded or unloaded, attached or unattached, your package will always have access to that function. In such cases, use:
@importFrom plyr ldply
And you can just refer to ldply() without the plyr:: prefix just as if it were another function in your package.
If ldply() is not so important - perhaps it is called only once in a not commonly used function - then, Writing R Extensions 1.5.1 gives the following advice:
If a package only needs a few objects from another package it can use a fully qualified variable reference in the code instead of a formal import. A fully qualified reference to the function f in package foo is of the form foo::f. This is slightly less efficient than a formal import and also loses the advantage of recording all dependencies in the NAMESPACE file (but they still need to be recorded in the DESCRIPTION file). Evaluating foo::f will cause package foo to be loaded, but not attached, if it was not loaded already—this can be an advantage in delaying the loading of a rarely used package.
(I think this advice is actually a little outdated because it is implying more separation between DESCRIPTION and NAMESPACE than currently exists.) It implies you should use @import plyr and refer to the function as plyr::ldply(). But in reality, it's actually suggesting something like putting plyr in the Suggests field of DESCRIPTION, which isn't exactly accommodated by roxygen2 markup nor exactly compliant with R CMD check.
In sum, the official line is that Hadley's advice (which you are quoting) is only preferred for rarely used functions from rarely used packages (and/or packages that take a considerable amount of time to load). Otherwise, just do @importFrom like WRE advises:
Using importFrom selectively rather than import is good practice and recommended notably when importing from packages with more than a dozen exports.
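Put together, a hedged sketch of the @importFrom route for this case (the wrapper function is purely illustrative):
# DESCRIPTION (excerpt)
# Imports:
#     plyr

# R/my_fun.R
#' @importFrom plyr ldply
my_fun <- function(lst) {
  ldply(lst, as.data.frame)   # ldply is usable without the plyr:: prefix
}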

How to use S3 methods from another package which uses export rather than S3method in its namespace without using Depends or library()

I'm working on an R package at present and trying to follow the best practice guidelines provided by Hadley Wickham at http://r-pkgs.had.co.nz. As part of this, I'm aiming to have all of the package dependencies within the Imports section of the DESCRIPTION file rather than the Depends since I agree with the philosophy of not unnecessarily altering the global environment (something that many CRAN and Bioconductor packages don't seem to follow).
I want to use functions within the Bioconductor package rhdf5 within one of my package functions, in particular h5write(). The issue I've now run into is that it doesn't have its S3 methods declared as such in its NAMESPACE. They are declared using (e.g.)
export(h5write.default)
export(h5writeDataset.matrix)
rather than
S3method(h5write, default)
S3method(h5writeDataset, matrix)
The generic h5write is defined as:
h5write <- function(obj, file, name, ...) {
  res <- UseMethod("h5write")
  invisible(res)
}
In practice, this means that calls to rhdf5::h5write fail because there is no appropriate h5write method registered.
As far as I can see, there are three solutions to this:
Use Depends rather than Imports in the DESCRIPTION file.
Use library("rhdf5") or require("rhdf5") in the code for the relevant function.
Amend the NAMESPACE file for rhdf5 to use S3method() rather than export().
All of these have disadvantages. Option 1 means that the package is loaded and attached to the global environment even if the relevant function in my package is never called. Option 2 means using library() inside a package, which again attaches the package to the global environment and is also discouraged per Hadley Wickham's guidelines. Option 3 would mean relying on the other package author to update their package on Bioconductor, and also means that the S3 methods are no longer exported, which could in turn break other packages that rely on calling them explicitly.
Have I missed another alternative? I've looked elsewhere on StackOverflow and found the following somewhat relevant questions Importing S3 method from another package and
How to export S3 method so it is available in namespace? but nothing that directly addresses my issue. Of note, the key difference from the first of these two is that the generic and the method are both in the same package, but the issue is the use of export rather than S3method.
Sample code to reproduce the error (without needing to create a package):
loadNamespace("rhdf5")
rhdf5::h5write(1:4, "test.h5", "test")
Error in UseMethod("h5write") :
  no applicable method for 'h5write' applied to an object of class
  "c('integer', 'numeric')"
Alternatively, there is a skeleton package at https://github.com/NikNakk/s3issuedemo which provides a single function demonstrateIssue() which reproduces the error message. It can be installed using devtools::install_github("NikNakk/s3issuedemo").
The key here is to import the specific methods in addition to the generic you want to use. Here is how you can get it to work for the default method.
Note: this assumes that the test.h5 file already exists.
#' @importFrom rhdf5 h5write.default
#' @importFrom rhdf5 h5write
#' @export
myFun <- function() {
  h5write(1:4, "test.h5", "test")
}
I also have put up my own small package demonstrating this here.
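For reference, the NAMESPACE directives roxygen2 should generate from the tags above look roughly like this (a sketch, not copied from the linked package):
export(myFun)
importFrom(rhdf5, h5write)
importFrom(rhdf5, h5write.default)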

How to define which variables or functions from a package are exported

My R package uses an internal variable x. If I load the package (I've only tried using devtools::load_all), then x doesn't appear in the ls() list, but it does have a value. How can I avoid this?
I'm fine with the user being able to access the variable with myPackage::x, but not simply x.
The load_all function has an export_all argument.
From ?load_all
If TRUE (the default), export all objects. If FALSE, export only the objects that are listed as exports in the NAMESPACE file.
So, try using export_all=FALSE in your load_all call.
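For example, a minimal sketch (the package path is illustrative):
library(devtools)
load_all("path/to/myPackage", export_all = FALSE)   # only objects listed as exports in NAMESPACE are made visible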
Try building the package first, and check whether the problem still exists. The exports from a package are defined in the NAMESPACE file. When you use devtools::load_all, the namespace isn't loaded (see here). Read more about this and about building a package in the manual Writing R Extensions.
You might be using a default export pattern in your NAMESPACE file. Check it in your package, and if it looks like this:
exportPattern("^[^\\.]")
then the package exports everything from the namespace that doesn't start with a dot. So you either call the variable .x, or you change the exportPattern() to, e.g.,
export(myfun1, myfun2)
to export the functions myfun1 and myfun2 from the package. By explicitly defining what you want to export, you avoid making things available when there is no need for them.
