I have solved variants of the "no visible binding" notes that one gets when checking a package. However, I am unable to solve the case of the <<- assignment.
Specifically, I had defined and used a local variable in several functions, such as:
fName = function(df){
  eval({vName <<- 0}, envir = environment(fName))
}
However, when I run check() from devtools, I get the note:
fName: no visible binding for '<<-' assignment to 'vName'
So, I tried using a different syntax, as follows:
fName = function(df){
  assign("vName", 0, envir = environment(fName))
}
But got the check() error:
cannot add bindings to a locked environment
When I tried:
fName = function(df){
  assign("vName", 0, envir = environment(-1))
}
I got the error:
use of NULL environment is defunct
So, my question is how I can accomplish the <<- assignment without getting a note from check() in devtools.
Thank you.
The easy answer is: don't use <<- in your package. One alternative way you can make assignments to an environment is to create your own environment and lock the binding:
e <- new.env()
e$vName <- 0L
lockBinding("vName", e)
vName
# Error: object 'vName' not found
with(e, vName)
# [1] 0
e$vName <- 5
# Error in e$vName <- 5 : cannot change value of locked binding for 'vName'
You can also lock the environment itself:
lockEnvironment(e)
rm(vName, envir = e)
# Error in rm(vName, envir = e) :
# cannot remove bindings from a locked environment
Have a look at help(bindenv); it's a good read.
Update: Since you mentioned you might be wanting to make the assignment at run time rather than at load time, have a read of help(globalVariables) as well. From that help page:
For globalVariables, the names supplied are of functions or other objects that should be regarded as defined globally when the check tool is applied to this package. The call to globalVariables will be included in the package's source. Repeated calls in the same package accumulate the names of the global variables.
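A minimal sketch of what that declaration looks like in practice (the file name R/globals.R is only a convention, and vName is the variable from the question):
# Contents of, e.g., R/globals.R: declare vName so check() no
# longer flags the <<- assignment
utils::globalVariables("vName")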
I don't know if it helps after all these years, but I had the same problem, and what I did to solve it was:
utils::globalVariables(c("global_var"))
Write this R code somewhere inside the R directory (save it as an R file). Whenever you assign a global variable, also assign it locally, like so:
global_var <<- 1
global_var <- 1
That worked for me.
Related
Preamble: package structure
I have an R package that contains an R/globals.R file with the following content (simplified):
utils::globalVariables("COUNTS")
Then I have a function that simply uses this variable. For example, R/addx.R contains a function that adds a number to COUNTS
addx <- function(x) {
  COUNTS + x
}
This is all fine when doing a devtools::check() on my package, there's no complaining about COUNTS being out of the scope of addx().
Problem: writing a unit test
However, say I also have a tests/testthat/test-addx.R file with the following content:
test_that("addition works", expect_gte(addx(1), 1))
The content of the test doesn't really matter here, because when running devtools::test() I get an "object 'COUNTS' not found" error.
What am I missing? How can I correctly write this test (or set up my package)?
What I've tried to solve the problem
Adding utils::globalVariables("COUNTS") to R/addx.R, either before, inside or after the function definition.
Adding utils::globalVariables("COUNTS") to tests/testthtat/test-addx.R in all places I could think of.
Manually initializing COUNTS (e.g., with COUNTS <- 0 or <<- 0) in all places of tests/testthtat/test-addx.R I could think of.
Reading some examples from other packages on GitHub that use a similar syntax (source).
I think you misunderstand what utils::globalVariables("COUNTS") does. It just declares that COUNTS is a global variable, so when the code analysis sees
addx <- function(x) {
  COUNTS + x
}
it won't complain about the use of an undefined variable. However, it is up to you to actually create the variable, for example by an explicit
COUNTS <- 0
somewhere in your source. I think if you do that, you won't even need the utils::globalVariables("COUNTS") call, because the code analysis will see the global definition.
Where you would need it is when you're doing some nonstandard evaluation, so that it's not obvious where a variable comes from. Then you declare it as a global, and the code analysis won't worry about it. For example, you might get a warning about
subset(df, Col1 < 0)
because it appears to use a global variable named Col1, but of course that's fine, because the subset() function evaluates in a non-standard way, letting you include column names without writing df$Col1.
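In that case the declaration is a single line; a minimal sketch, assuming it lives in one of the package's source files:
# Declare Col1 as a known global so the code analysis ignores its
# non-standard-evaluation use inside subset()
utils::globalVariables("Col1")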
@user2554330's answer is great for many things.
If I understand correctly, you have a COUNTS that needs to be updateable, so putting it in the package environment might be an issue.
One technique you can use is the use of local environments.
Two alternatives:
If it will always be referenced in one function, it might be easiest to change the function from
myfunc <- function(...) {
  # do something
  COUNTS <- COUNTS + 1
}
to
myfunc <- local({
  COUNTS <- NA
  function(...) {
    # do something
    COUNTS <<- COUNTS + 1
  }
})
What this does is create a local environment "around" myfunc, so when it looks for COUNTS, it will be found immediately. Note that it reassigns using <<- instead of <-, since the latter would not update the different-environment-version of the variable.
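To see the pattern in isolation, here is a minimal, self-contained sketch using an illustrative counter (not part of the package above):
counter <- local({
  n <- 0L                # lives in the local() environment, not the global one
  function() {
    n <<- n + 1L         # <<- updates n in the enclosing environment
    n
  }
})
counter()  # [1] 1
counter()  # [1] 2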
You can actually access this COUNTS from another function in the package:
otherfunc <- function(...) {
  COUNTScopy <- get("COUNTS", envir = environment(myfunc))
  COUNTScopy <- COUNTScopy + 1
  assign("COUNTS", COUNTScopy, envir = environment(myfunc))
}
(Feel free to name it COUNTS here as well, I used a different name to highlight that it doesn't matter.)
While the use of get and assign is a little inconvenient, it should only be required twice per function that needs to do this.
Note that the user can get to this if needed, but they'll need to use similar mechanisms. Perhaps that's a problem; in my packages where I need some form of persistence like this, I have used convenience getter/setter functions.
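For example, a hypothetical getter/setter pair along those lines (the names are illustrative):
get_counts <- function() {
  get("COUNTS", envir = environment(myfunc))
}
set_counts <- function(value) {
  assign("COUNTS", value, envir = environment(myfunc))
  invisible(value)
}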
You can place an environment within your package, and then use it like a named list within your package functions:
E <- new.env(parent = emptyenv())
myfunc <- function(...) {
  # do something
  E$COUNTS <- E$COUNTS + 1
}
otherfunc <- function(...) {
  E$COUNTS <- E$COUNTS + 1
}
We do not need the get/assign pair of functions, since E (a horrible name, chosen for its brevity) should be visible to all functions in your package. If you don't need the user to have access, then keep it unexported. If you want users to be able to access it, then exporting it via the normal package mechanisms should work.
Note that with both of these, if the user unloads and reloads the package, the COUNTS value will be lost/reset.
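If a known starting value on every load matters, one option (a sketch, not required by the technique) is to initialize the environment in the package's .onLoad hook:
# Runs automatically when the package is loaded, so COUNTS is
# reset to a known value on every load/reload
.onLoad <- function(libname, pkgname) {
  E$COUNTS <- 0L
}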
I'll provide a third option, in case the user wants/needs direct access, or you don't want to do this type of value management within your package.
Make the user provide it at all times. For this, add an argument to every function that needs it, and have the user pass an environment. I recommend an environment because most arguments are passed by value, while environments allow referential semantics (pass by reference).
For instance, in your package:
myfunc <- function(..., countenv) {
  stopifnot(is.environment(countenv))
  # do something
  countenv$COUNT <- countenv$COUNT + 1
}
otherfunc <- function(..., countenv) {
  countenv$COUNT <- countenv$COUNT + 1
}
new_countenv <- function(init = 0) {
  E <- new.env(parent = emptyenv())
  E$COUNT <- init
  E
}
where new_countenv is really just a convenience function.
The user would then use your package as:
mycount <- new_countenv()
myfunc(..., countenv = mycount)
otherfunc(..., countenv = mycount)
Suppose there is a set of functions, drawn from a package not written by me, that I want to assign to a special behavior on error. My current concern is with the _impl family of functions in dplyr. Take mutate_impl, for example. When I get an error from mutate, traceback almost always leads me to mutate_impl, but it is usually a ways up the call stack -- I have seen it be as many as 15 calls from the call to mutate. So what I want to know at that point is typically how the arguments to mutate_impl relate to the arguments I originally supplied to mutate (or think I did).
So, this code is probably wrong in too many ways to count -- certainly it does not work -- but I hope it at least helps to express my intent. The idea is that I could wrap it around mutate_impl, and if it produces an error, it saves the error message and a description of the arguments and returns them as a list:
str_impl <- function(f) {
  tryCatch(f, error = function(c) {
    msg <- conditionMessage(c)
    args <- capture.output(str(as.list(match.call(call(f)))))
    list(message = msg, arguments = args)
  })
}
assign("str_impl_result", str_impl(mutate_impl), envir = .GlobalEnv)  # name is illustrative
Still, this falls short of what I really want, because even without the constraint of working code, I could not figure out how to produce a draft. What I really want is to be able to identify a function or list of functions that should get this behavior on error, and then have it occur whenever and wherever that function is called. I could not think of any way to even start doing that without rewriting functions in the dplyr package environment, which struck me as a really bad idea.
The final assignment to the global environment is supposed to get the error object back to somewhere I can find it, even if the call to mutate_impl happens somewhere inaccessible, like in an environment that ceases to exist after the error.
Probably the best way of achieving what you want is via the trace functionality. It's surely worth reading the help about trace, but here is a working example:
library(dplyr)
trace("mutate_impl", exit = quote({
if (class(returnValue())[1]=="NULL") {
cat("df\n")
print(head(df))
cat("\n\ndots\n")
print(dots)
} else {
# no problem, nothing to do
}
}), where = mutate, print = FALSE)
# ok
xx <- mtcars %>% mutate(gear = gear * 2)
# not ok, extra output
xx <- mtcars %>% mutate(gear = hi * 2)
It should be fairly simple to adjust this to your specific needs, e.g. if you want to log to a file instead:
trace("mutate_impl", exit = quote({
if (class(returnValue())[1]=="NULL") {
sink("error.log")
cat("df\n")
print(head(df))
cat("\n\ndots\n")
print(dots)
sink()
} else {
# no problem, nothing to do
}
}), where = mutate, print = FALSE)
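Once you are done debugging, the trace can be removed again with untrace(), e.g.:
# Remove the trace from mutate_impl when the logging is no longer needed
untrace("mutate_impl", where = mutate)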
I seem to be having the same issue as is seen here, so I started checking the environments that my frames/matrices are in. I have a character matrix and a table that was imported as a list. I have been able to create a user-defined function that I have debugged, and I can confirm that it runs through step by step, assigning values from the character matrix to those needing change in the list.
{
  i <- 1
  j <- NROW(v)
  while (i < j) {
    if (v[i] %in% Convert[, 1]) {
      n <- match(v[i], Convert[, 1])
      v[i] <- Convert[n, 2]
    }
    i <- i + 1
  }
}
That is the code in case you need to see what I am doing.
The problem is that whenever I check the environment of either the list or the matrix, I get NULL (using environment()). I tried using assign() to create a new matrix. It seems, based on the link above, that this is an environment issue, but if the lists/matrices involved have no environment, what is one to do?
Post Note: I have tried converting these to different formats (using as.character or as.list), but I don't know if this is even relevant if I can't get the environment issue resolved above.
environment() works only for functions, not for variables.
In fact, environment() gives the enclosing environment, which makes sense only for a function; it is not the binding environment you are interested in for a variable.
If you want the binding environment, use the pryr package:
library(pryr)
where("V")
Here is an example:
e <- new.env()
e$test <- function(x) { x }
environment(e$test)
You can see the environment here is the global environment, because you defined the function there; but the binding environment (that is, the environment where you find the name of the variable) is e.
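Continuing that example, a quick sketch of where() applied to the same object:
library(pryr)
# where() walks up from env and returns the environment in which
# the name is bound -- here that is e itself
where("test", env = e)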
See http://adv-r.had.co.nz/Environments.html to understand the problem better.
I'm trying to write functions that use data.table methods to add and edit columns by reference. However, the same code that works in the console does not work when called from within a function. Here is a simple example:
> dt <- data.table(alpha = c("a","b","c"), numeric = 1:3)
> foo <- function(x) {
  print(is.data.table(x))
  x[, "uppercase" := toupper(alpha)]
}
When I call this function, I get the following error.
> test = foo(dt)
[1] TRUE
Error in `:=`("uppercase", toupper(alpha)) :
Check that is.data.table(DT) == TRUE. Otherwise, := and `:=`(...) are
defined for use in j, once only and in particular ways. See help(":=").
Yet if I type the same code directly into the console, it works perfectly fine.
> dt[,"uppercase":=toupper(alpha)]
> dt
alpha numeric uppercase
1: a 1 A
2: b 2 B
3: c 3 C
I've scoured stackoverflow and the only clues I could find suggest that the function might be looking for alpha in a different environment or parent frame.
EDIT: More information to reproduce the error. The function is not within a package, but declared directly in the global environment. I didn't realize this before, but in order to reproduce the error I need to dump the function to a file, then load it. I've saved all of my personal functions via dput and load them into R with dget, so that's why I get this error often.
> dput(foo, "foo.R")
> foo <- dget("foo.R")
> foo(dt)
Error in `:=` etc...
This problem is a different flavor of the one described in the post Function on data.table environment errors. It's not exactly a problem, just how dget is designed. For those curious, this happens because dget gives the function base as its parent environment, and the base namespace isn't data.table-aware.
If x is a function the associated environment is stripped. Hence scoping information can be lost.
One workaround is to assign the function to the global environment:
> environment(foo) <- .GlobalEnv
But I think the best solution here is to use saveRDS to transfer R objects, which is what ?dget recommends.
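A minimal sketch of the saveRDS route (the file name is illustrative):
saveRDS(foo, "foo.rds")    # serializes foo together with its environment
foo <- readRDS("foo.rds")
foo(dt)                    # := now updates dt by reference as expected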
I use some user-defined small functions as helpers. These functions are all stored in an R_HOME_USER/helper directory. Until now, these functions were sourced at R start-up; the overall method is something like lapply(my.helper.list, source). I now want these functions to be sourced but not to appear in my environment, as they pollute it.
A first, clean approach would be to build a package with all my helper .R files. For now, I do not want to follow this method. A second approach would be to name these helpers with a leading dot, but it annoys me to have to type .helper1().
The best way would be to define these helpers in a specific and accessible environment, but I am struggling with the code. My idea is to first create a new environment:
.helperEnv <- new.env(parent = baseenv())
attach(.helperEnv, name = '.helperEnv')
Fine, search() returns '.helperEnv' in the list. Then I run:
assign('helper1', helper1, envir = .helperEnv)
rm(helper1)
Fine, ls(.helperEnv) returns 'helper1', and this function no longer appears in my environment.
The issue is I can't run helper1 (object not found). I guess I am not on the right track and would appreciate some hints.
I think you should set the pos argument in your call to attach to a negative number:
.helperEnv <- new.env()
.helperEnv$myfunc <- function(x) x^3 + 1
attach(.helperEnv, name = "helper", pos = -1)
ls()
#character(0)
myfunc
#function(x) x^3+1
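To tie this back to the original set-up, a hedged sketch (assuming my.helper.list holds the helper file paths from the question): source each file directly into the environment before attaching it, so nothing lands in the global environment.
.helperEnv <- new.env()
# sys.source() evaluates each file's code inside .helperEnv rather
# than in the global environment
lapply(my.helper.list, sys.source, envir = .helperEnv)
attach(.helperEnv, name = "helper", pos = -1)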