I have a simple data.table and I would like to add one of its columns to each of the others. The following code works:
x <- data.table(a=1:5,b=5:1,c=rep(999,5))
for(k in c("a","b")){x[,k] <- x[,..k]+x[,.(c)]}
Now here is the question: why do I have to use .. at all in this assignment? And if I also try to use .. in the first position, i.e. on the left-hand side:
for(k in c("a","b")){x[,..k] <- x[,..k]+x[,.(c)]}
I get an error: "[...] object '..k' not found". It seems strange that I have to change the syntax depending on which side of the assignment the expression is on.
With a data.frame, by contrast, the equivalent formulation is perfectly clear:
for(k in c("a","b")){x[,k] <- x[,k]+x[,c]} # error with DT
x <- data.frame(a=1:5,b=5:1,c=rep(999,5))
for(k in c("a","b")){x[,k] <- x[,k]+x[,"c"]} # works with dataframe
So I am wondering (1) whether the code above is the correct way to do this in data.table (please explain the .. operator; data.table FAQ 1.1 doesn't address it in particular); and (2) whether there are cleaner ways to write this. Thanks for any hints.
From the official introduction (slightly edited for your example):
For those familiar with the Unix terminal, the .. prefix should be
reminiscent of the “up-one-level” command, which is analogous to
what’s happening here – the .. signals to data.table to look for the
k variable “up-one-level”, i.e., in the loop environment
in this case.
So the .. prefix escapes the data.table and looks for the variable k one level up, i.e. in the calling environment, takes its value there and comes back. The reason it is needed is that j is normally evaluated as an expression inside the data.table, so a plain k would give you the string "a" rather than the column named "a"; ..k (like with = FALSE) tells data.table to treat k as a variable holding the name(s) of the columns to select.
You can also use the with argument:
x[,k,with=FALSE]
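If you are also after a cleaner way to write the update itself, here is a minimal sketch of the by-reference idioms (assuming the same x as in the question); neither needs .. and both avoid the full-table copy that the <- form can trigger on every iteration:
library(data.table)
x <- data.table(a = 1:5, b = 5:1, c = rep(999, 5))
cols <- c("a", "b")
# option 1: set() inside an ordinary for loop
for (k in cols) set(x, j = k, value = x[[k]] + x[["c"]])
# option 2: a single := call over the selected columns (run on a fresh x,
# otherwise c is added a second time)
# x[, (cols) := lapply(.SD, function(col) col + c), .SDcols = cols]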
Edit:
I just checked the data.table source code. The variable is looked up in parent.frame(), i.e. the environment from which [.data.table is called. This lookup is triggered by .. or by the with argument, so if you don't use one of them, the function does not reach into the calling environment to get the variable.
A related question about parent.frame() can be found here
I was trying to make Python 3-style assignment unpacking possible in R (e.g., a, *b, c = [1,2,3], "C"), and although I got so close (you can check out my code here), I ultimately ran into a few (weird) problems.
My code is meant to work like this:
a %,*% b %,% c <- c(1,2,3,4,5)
and will assign a = 1, b = c(2,3,4) and c = 5 (my code actually does do this, but with one small snag I will get to later).
In order for this to do anything, I have to define:
`%,%` <- function(lhs, rhs) {
...
}
and
`%,%<-` <- function(lhs, rhs, value) {
...
}
(as well as %,*% and %,*%<-, which are slight variants of the previous functions).
First issue: why R substitutes *tmp* for the lhs argument
As far as I can tell, R evaluates this code from left to right at first (i.e., going from a to c) until it reaches the last %,%, whereupon it goes back from right to left, assigning values along the way. But the first weird thing I noticed is that when I do match.call() or substitute(lhs) in something like x %infix% y <- z, it says that the input to the lhs argument of %infix% is *tmp*, instead of, say, a or x.
This is bizarre to me, and I couldn't find any mention of it in the R manual or docs. I actually make use of this weird convention in my code (it doesn't show this behavior on the right-hand side of the assignment, so I can use the presence of the *tmp* input to make %,% behave differently on that side), but I don't know why it does this.
Second issue: why R checks for object existence before anything else
My second problem is what makes my code ultimately not work. I noticed that if an assignment starts with an undefined variable name on the left-hand side, R doesn't seem to even start evaluating the expression; it returns the error object '<variable name>' not found. That is, if x is not defined, x %infix% y <- z won't evaluate, even if %infix% doesn't actually use or evaluate x.
Why does R behave like this, and can I change it or get around it? If I could run the code in %,% before R checks whether x exists, I could probably hack it so that this wouldn't be a problem, and my Python unpacking code would be useful enough to actually share. But as it stands, the first variable needs to already exist, which is just too limiting in my opinion. I know that I could probably do something by changing the <- to a custom infix operator like %<-%, but then my code would be so similar to the zeallot package that I wouldn't consider it worth it. (It's already very close in what it does, but I like my style better.)
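To make the second issue concrete, here is a minimal sketch with a hypothetical %infix% operator; the replacement function ignores both operands, yet the left-most name still has to exist:
`%infix%` <- function(lhs, rhs) rhs
`%infix%<-` <- function(lhs, rhs, value) value   # ignores lhs and rhs entirely
x <- 0
x %infix% y <- 10   # fine: x becomes 10, and y is never even evaluated
rm(x)
x %infix% y <- 10   # Error: object 'x' not found, before %infix%<- is called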
Edit:
Following Ben Bolker's excellent advice, I was able to find a way around the problem... by overwriting <-.
`<-` <- function(x, value) {
  base::`<-`(`=`, base::`=`)                     # rebind = locally to the base version
  find_and_assign(match.call(), parent.frame())  # make sure the left-most name exists first
  do.call(base::`<-`, list(x = substitute(x), value = substitute(value)),
          quote = FALSE, envir = parent.frame())
}

find_and_assign <- function(expr, envir) {
  base::`<-`(`<-`, base::`<-`)   # rebind <- and = locally to the base versions
  base::`<-`(`=`, base::`=`)
  while (is.call(expr)) expr <- expr[[2]]   # walk down to the left-most symbol
  if (!rlang::is_symbol(expr)) return()
  var <- rlang::as_string(expr)             # a little safer than as.character()
  if (!exists(var, envir = envir)) {
    assign(var, NULL, envir = envir)        # create it so the real assignment can proceed
  }
}
I'm pretty sure that this would be a mortal sin though, right? I can't exactly see how it would mess anything up, but the tingling of my programmer senses tells me this would not be appropriate to share in something like a package... How bad would this be?
For your first question, about *tmp* (and maybe related to your second question):
From Section 3.4.4 of the R Language definition:
Assignment to subsets of a structure is a special case of a general mechanism for complex assignment:
x[3:5] <- 13:15
The result of this command is as if the following had been executed
`*tmp*` <- x
x <- "[<-"(`*tmp*`, 3:5, value=13:15)
rm(`*tmp*`)
Note that the index is first converted to a numeric index and then the elements are replaced sequentially along the numeric index, as if a for loop had been used. Any existing variable called *tmp* will be overwritten and deleted, and this variable name should not be used in code.
The same mechanism can be applied to functions other than [. The replacement function has the same name with <- pasted on. Its last argument, which must be called value, is the new value to be assigned.
I can imagine that your second problem has to do with the first step of the "as if" code: if R is internally trying to evaluate *tmp* <- x, it may be impossible to prevent it from trying to evaluate x at that point ...
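Spelling the "as if" code out for the infix case makes both issues visible at once (a sketch, using the %infix% from the question):
`*tmp*` <- x                              # this step fails if x does not exist
x <- `%infix%<-`(`*tmp*`, y, value = z)   # which is why lhs is seen as *tmp*, not x
rm(`*tmp*`)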
If you want to dig farther, I think the internal evaluation code used to deal with "complex assignment" (as it seems to be called in the internal comments) is around here ...
I am wondering if I can create an object within a for loop, i.e. without having to initialize it first. I have tried doing it the way one might in MATLAB. Please see the following R code:
> for (i in 1:nrow(snp.ids)) {
+ snp.fasta[i]<-entrez_fetch(db="protein", id=snp.ids[i,], rettype="xml",retmode="text")
+ snp.seq[i]<-xpathSApply(xmlParse(snp.fasta[i]), "//Seq-data_iupacaa",xmlValue)
+ }
Error in snp.fasta[i] <- entrez_fetch(db = "protein", id = snp.ids[i, :
object 'snp.fasta' not found
It obviously does not find snp.fasta, but you can see from the code that I am trying to create snp.fasta. Can anyone shed any light on why it is not created within the for loop, and what the proper way to initialize snp.fasta would be if I cannot create it within the for loop?
Thanks
Generally, yes, that would be an acceptable way to loop over a vector of ids. Just assign to a non-indexed object:
for (i in 1:nrow(snp.ids)) {
  snp.fasta <- entrez_fetch(db = "protein", id = snp.ids[i, ], rettype = "xml", retmode = "text")
  snp.seq <- xpathSApply(xmlParse(snp.fasta), "//Seq-data_iupacaa", xmlValue)
}
(You would then still need to assign any useful result to an indexable object, build up a collection of results within the loop, or print something. As it stands, this example will overwrite snp.seq on each iteration and leave only the last value.)
It's a bit confusing to see id=snp.ids[i,]. That implies snp.ids has two dimensions, in which case I would have expected a column name or number to be used: id=snp.ids[i,"id"]. You should provide dput(head(snp.ids)) so we can do some realistic testing rather than this half-assed guesswork.
In R, subsetting is itself a function call, so assigning a value to an element of a vector:
a[1] = 123
is identical to
"["(a, 1) = 123
Here [ is an ordinary function, and R has to fetch the current value of a before it can replace its first element; if a is not defined, that lookup fails with an error.
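A quick illustration (assuming a fresh session in which a has not been defined):
a[1] <- 123   # Error: object 'a' not found
a <- NULL
a[1] <- 123   # works: a is now the length-one numeric vector 123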
Before the loop:
snp.fasta <- NULL
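Putting it together, a sketch of the original loop with both vectors initialized up front (the entrez_fetch and xpathSApply calls are kept from the question; rentrez and XML are assumed, and the code is not run here):
library(rentrez)
library(XML)
n <- nrow(snp.ids)
snp.fasta <- character(n)   # pre-allocate instead of growing inside the loop
snp.seq <- character(n)
for (i in seq_len(n)) {
  snp.fasta[i] <- entrez_fetch(db = "protein", id = snp.ids[i, ], rettype = "xml", retmode = "text")
  snp.seq[i] <- xpathSApply(xmlParse(snp.fasta[i]), "//Seq-data_iupacaa", xmlValue)
}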
The following simple example will help me address a problem in my program implementation.
fun2 <- function(j) {
  x <- rnorm(10)
  y <- runif(10)
  Sum <- sum(x, y)
  Prod <- prod(x, y)
  return(Sum)
}
j=1:10
Try<-lapply(j,fun2)
#
I want to store "Prod" at each iteration so I can access it after running the function fun2. I tried using assign() to create space assign("Prod",numeric(10),pos=1)
and then assigning Prod at j-th iteration to Prod[j] but it does not work.
#
Any idea how this can be done?
Thank you
You can return anything you like in the return() command. You could return a list, return(list(Sum, Prod)), or a data frame, return(data.frame(In = j, Sum = Sum, Prod = Prod)).
I would then convert that list of data.frames into a single data.frame
Try2 <- do.call(rbind,Try)
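For completeness, a sketch of the full function with the data.frame return, using the names from the question:
fun2 <- function(j) {
  x <- rnorm(10)
  y <- runif(10)
  Sum <- sum(x, y)
  Prod <- prod(x, y)
  data.frame(In = j, Sum = Sum, Prod = Prod)
}
Try <- lapply(1:10, fun2)
Try2 <- do.call(rbind, Try)   # one row per j, with columns In, Sum and Prod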
Maybe re-think the problem in a more vectorized way, taking advantage of the implied symmetry to represent the intermediate values as a matrix and operating on that:
ni = 10; nj = 20
x = matrix(rnorm(ni * nj), ni)
y = matrix(runif(ni * nj), ni)
sums = colSums(x + y)
prods = apply(x * y, 2, prod)
Thinking about the vectorized version is as applicable to your 'real' problem as it is to the sum/prod example. In practice, when thinking in terms of vectors fails, I've never used the environment or concatenation approaches in the other answers, but rather the simple solution of returning a list or vector.
I have done this before, and it works. It's good for a quick fix, but it's kind of sloppy. The <<- operator assigns outside the function, here to the global environment.
fun2 <- function(j) {
  x <- rnorm(10)
  y <- runif(10)
  Sum <- sum(x, y)
  Prod[j] <<- prod(x, y)
}
j=1:10
Prod <- numeric(length(j))
Try<-lapply(j,fun2)
Prod
thelatemail's and JeremyS's solutions are probably what you want. Using lists is the normal way to pass back a bunch of different data items, and I would encourage you to use it. The list version is quoted here so no one thinks I'm advocating the direct option:
return(list(Sum,Prod))
Having said that, suppose you really don't want to pass them back: you could also put them directly in the parent environment from within the function, using either assign() or the superassignment operator. This practice can be looked down on by functional programming purists, but it does work. It is basically what you were originally trying to do.
Here's the superassignment version
fun2 <- function(j) {
  x <- rnorm(10)
  y <- runif(10)
  Sum <- sum(x, y)
  Prod[j] <<- prod(x, y)
  return(Sum)
}
j=1:10
Prod <- numeric(10)
Try<-lapply(j,fun2)
Note that the superassignment searches back for the first environment in which the variable exists and modifies it there. It's not appropriate for creating new variables above where you are.
And an example version using the environment directly
fun2 <- function(j, env) {
  x <- rnorm(10)
  y <- runif(10)
  Sum <- sum(x, y)
  env$Prod[j] <- prod(x, y)
  return(Sum)
}
j=1:10
Prod <- numeric(10)
Try<-lapply(j,fun2,env=parent.frame())
Notice that if you had called parent.frame() from within the function, you would need to go back two frames, because lapply() creates its own (see the sketch below). This approach has the advantage that you can pass any environment you want instead of parent.frame(), and the value will be modified there. This is the seldom-used R version of writable pass-by-reference. It's safer than superassignment because you know exactly where the variable being modified lives.
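To illustrate the frame-counting point, a sketch of what the function would have to do if it called parent.frame() itself instead of receiving env as an argument (assuming Prod is pre-allocated as above):
fun2 <- function(j) {
  env <- parent.frame(2)   # frame 1 is lapply's evaluation frame, frame 2 is the caller's
  env$Prod[j] <- prod(rnorm(10), runif(10))
  invisible(NULL)
}
j <- 1:10
Prod <- numeric(10)
Try <- lapply(j, fun2)
Prod   # filled in by the calls above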
I have created a function (which is quite long) that I have saved in a .txt file.
It works well (I use source() to access it).
My problem is that I create a few variables inside that function, i.e.:
myfun <- function(a,b) {
  Var1 = ....
  Var2 = Var1 + ..
}
Now I want to get those variables.
When I include return() inside the function, it's fine: the value comes up on the screen. But when I type Var1 outside the function, I get the error message "the object cannot be found".
I am new to R, but I was thinking it might be because myfun operates in a different environment than the global one. However, when I did
environment()
<environment: R_GlobalEnv>
environment(myfun1)
<environment: R_GlobalEnv>
It seems to me the problem is elsewhere...
Any idea?
Thanks
I realize this question is more than 3 years old, but I believe the option you are looking for is as follows:
myfun <- function(a, b) {
  Var1 <- (a + b) / 2   # do whatever logic you have to do here...
  Var2 <<- Var1 + a     # then send the result to the global environment with the "<<-" operator
}
The double "<<-" assignment operator will output "Var2" to the global environment and you can then use or reference it however you like without having to use "return()" inside your function.
If you want to do it in a nice way, write a class and then provide a print method. Within such a class it is possible to return variables invisibly. A nice book which covers such topics is "The Art of R Programming".
An easy fix would be to save each variable you need later in a list and then return that list (as Peter pointed out):
return(list(VAR1=VAR1, .....))
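A minimal sketch of that list-return approach, reusing the placeholder logic from the earlier answer:
myfun <- function(a, b) {
  Var1 <- (a + b) / 2
  Var2 <- Var1 + a
  list(Var1 = Var1, Var2 = Var2)
}
res <- myfun(1, 2)
res$Var1   # the values are now available outside the function
res$Var2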
I'd like to pass a params argument to a function and then attach it, so that I can use a instead of params$a every time I refer to the list element a.
run.simulation <- function(model, params) {
  attach(params)
  # Use elements of params as parameters in a simulation
  detach(params)
}
Is there a problem with this? For example, if I have defined a global variable named c and have also defined an element named c of the list params, which value would be used after the attach command?
Noah has already pointed out that using attach is a bad idea, even though you see it in some examples and books. There is a way around it: you can use a "local attach", which is what with() does. In Noah's dummy example, this would look like
with(params, print(a))
which will yield identical result, but is tidier.
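In the function from the question this would look something like the following sketch (the body is just a placeholder for the actual simulation code):
run.simulation <- function(model, params) {
  with(params, {
    # elements of params (e.g. a) are visible here by name,
    # and nothing stays attached afterwards
    print(a)
  })
}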
Another possibility is:
run.simulation <- function(model, params) {
  # Assume params is a list of parameters from
  # "params <- list(name1=value1, name2=value2, etc.)"
  for (v in 1:length(params)) assign(names(params)[v], params[[v]])
  # Use elements of params as parameters in a simulation
}
The easiest way to settle scope questions like this is usually to try something simple out:
a = 1
params = c()
params$a = 2
myfun <- function(params) {
  attach(params)
  print(a)
  detach(params)
}
myfun(params)
The following object(s) are masked _by_ .GlobalEnv:
a
# [1] 1
As you can see, R is picking up the global variable a here.
It's almost always a good idea to avoid using attach and detach wherever possible; scope ends up being tricky to handle. (Incidentally, it's also best to avoid naming variables c: R will often figure out what you're referring to, but there are so many other letters out there, why risk it?) In addition, I find code using attach/detach almost impossible to decipher.
Jean-Luc's answer helped me immensely in a case where I had a data.frame Dat instead of the list specified in the OP:
for (v in 1:ncol(Dat)) assign(names(Dat)[v], Dat[,v])