I've been defining some variables with the "<-" operator, like
elect<- copPob12$nivelaprob[copPob12$dispelect==5]
I can see their numerical values in the global environment,
but I want to see how I defined them, to be sure about the function I used, because they are subsets within subsets. I can find them in the "History" tab, but that takes too long.
Is there any function that can retrieve the way I defined the variable on the console?
Thanks a lot.
As I see the problem, maybe you are looking for this:
elect <<- copPob12$nivelaprob[copPob12$dispelect==5]
or you can write
elect <- copPob12$nivelaprob[copPob12$dispelect==5]
assign("elect", elect , envir = .GlobalEnv)
Here the assignment targets the global environment, so it also works from within a function.
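For example, a minimal sketch (using copPob12 from your question) showing that either form creates elect in the global environment even when run inside a function:
make_elect <- function() {
  # either line makes 'elect' visible in the global environment
  elect <<- copPob12$nivelaprob[copPob12$dispelect == 5]
  assign("elect", copPob12$nivelaprob[copPob12$dispelect == 5], envir = .GlobalEnv)
}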
Suppose that after some calculations I obtain a (floating) number, stored in the variable a, for example
a <- sqrt(2)
Now, I want to define a function that uses that parameter, for example:
myfunction <- function(x){x-a}
How can I save myfunction into an RDS file, in such a way that it can be loaded and used in a new R session, where the variable a is not defined?
Or, from a different perspective: how can I define the function while substituting the actual numerical value of a in its definition? That is, I'd like R to actually define the function
myfunction <- function(x){x - 1.4142...}
where the actual value of a has been substituted in the definition.
Simply trying saveRDS(myfunction, 'myfunction.rds') does not work: if I start a new R session and do
myfunction <- readRDS('myfunction.rds')
myfunction(1)
then R complains that a is not defined.
Please note that I'm here giving a minimal working example of the problem. Obviously, in the case above I could just define myfunction <- function(x){x-sqrt(2)} and save that in an RDS file; it could be loaded in a new session and used without problems.
However, in my case I have many parameters like a, not just one, obtained from long calculations. I'm not interested in saving their values; I only want to save the function that uses them in its definition, and to be able to use that function in a new R session.
An RDS file won't save the global environment, but if you create a closure, its enclosing environment (and the values stored in it) will be preserved. One way to do that would be
myfunction <- {function(a) function(x){x-a}}(a)
And then you can call this function like a regular function
myfunction(1)
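For example, a minimal sketch of the full round trip (the file name is just illustrative):
a <- sqrt(2)
myfunction <- (function(a) function(x) { x - a })(a)  # capture the current value of a
saveRDS(myfunction, "myfunction.rds")

## --- in a fresh R session, where 'a' is not defined ---
myfunction <- readRDS("myfunction.rds")
myfunction(1)  # returns 1 - sqrt(2) = -0.4142136, with no complaint about 'a'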
I'm using
sapply(list.files('scripts/', full.names=TRUE), source)
to run 80 scripts at once from the folder "scripts/", and I do not know exactly how this works. There are "intermediate" objects named identically across scripts (they are iterative scripts across 80 different biological populations). Does each script only use its own objects? Is there any risk of a script picking up the objects of a "previous" script that have not yet been removed from memory, or does this process work exactly as if the scripts were run manually, sequentially, one by one?
Many thanks in advance.
The quick answer is: each script runs independently. Imagine running a for loop that iterates through all the script files instead of using sapply; the results should be the same.
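For example, the sapply call is roughly equivalent to this loop (a sketch assuming the same folder):
for (f in list.files("scripts/", full.names = TRUE)) {
  source(f)  # each file is evaluated one after the other
}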
To prove my thoughts, I just did an experiment:
# This is foo.R
x <- mtcars
write.csv(x, "foo.csv")
# This is bar.R
x <- iris
write.csv(x, "bar.csv")
# Run them at once
sapply(list.files(), source)
Though the default of the "local" argument in source is FALSE, it turns out that I get two different csv files in my working directory: one named "foo.csv" containing the mtcars data frame, and the other named "bar.csv" containing the iris data frame.
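That said, with local = FALSE every script assigns into the global environment, so identically named objects are overwritten one after another rather than kept separate. If you want to guarantee that no script can see leftovers from a previous one, a hedged sketch is to source each file into its own fresh environment:
invisible(sapply(
  list.files("scripts/", full.names = TRUE),
  function(f) source(f, local = new.env())  # isolate each script's objects
))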
There are global variables, which you declare outside any function. As the name says, they are global and can be reassigned anywhere. If you declare a variable inside a function it will be a local variable and only take effect inside that particular function; it will not exist outside of it.
Example:
globalVar <- "I am global"

foo <- function() {
  localVar <- "I don't exist outside the foo function"
}
If you declare globalVar in the first script and refer to it in the last one, it will still be there. If you declare localVar inside a function in some script and try to use it in another script, outside the function, or in another function, you'll get an error (object 'localVar' not found).
Edit:
If there are no dependencies between the scripts (you don't need one to finish before continuing with another), it doesn't matter whether you run them in parallel or sequentially; the behaviour will be the same.
You only have to take care with global variables; local ones can't interfere with another script or function.
I'm having a problem with the below function:
ab <- matrix(c(1:20), nrow = 4)
rownames(ab) <- c("a", "b", "c", "d")
cd <- c("a", "c")

test <- function(x, y, ID_Tag){
  for(i in y) {
    M_scaled <- t(scale(t(x), center = T))
    a <- quantile(M_scaled[match(i, rownames(x)), ])
    assign(paste0("Probes_", ID_Tag, "_quan_", i), a)
  }
}

test(ab, cd, "C1")
x is the data frame/matrix,
y is the vector of row names I need to search for in rownames(x),
ID_Tag is the tag I use to distinguish my samples from each other.
The function runs, but no objects with those names are created afterwards.
Hope somebody can help me.
When you use assign within a function it will make the assignment to a variable that is accessible within that function only (i.e. it's like using <-). To get around this, you need to specify the envir argument in assign to be either the global environment globalenv() or the parent frame of the function. So try changing your assign statement to
assign(..., envir = parent.frame())
or
assign(..., envir = globalenv())
depending on what you want exactly (in the example you provided they are equivalent). Have a look at ?parent.frame for more info on these. Another possibility is to specify the pos argument in assign, check ?assign.
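For instance, the assign call inside the question's loop could become (a sketch targeting the global environment):
assign(paste0("Probes_", ID_Tag, "_quan_", i), a, envir = globalenv())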
As an aside, assigning global objects from within a function can lead to various problems in general. I find it better practice in your example to return a list of objects created in the for loop rather than use assign.
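A minimal sketch of that list-returning approach, using the question's objects:
test <- function(x, y, ID_Tag) {
  M_scaled <- t(scale(t(x), center = TRUE))
  out <- lapply(y, function(i) quantile(M_scaled[match(i, rownames(x)), ]))
  names(out) <- paste0("Probes_", ID_Tag, "_quan_", y)
  out  # return all quantile vectors at once instead of assigning them globally
}

res <- test(ab, cd, "C1")
res$Probes_C1_quan_a  # quantiles for row "a"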
I want to initialise some variables from an external data file. One way is to set up a file like the following foo.csv:
var1,var2,var3
value1,value2,value3
Then issue:
attach(read.csv('foo.csv'))
The problem is that this way var1, var2, var3 are not shown by ls() and, most of all, rm(list=ls()) doesn't clean everything anymore: var1, var2, var3 are still there.
As the default position for newly attached objects is 2, I can remove the workspace where these variables live via:
detach(pos=2)
or simply
detach()
since pos=2 is the default for detach too.
But detach() is "too" powerful: it can also detach packages and other objects loaded by default. This means that, if one attaches many datasets, removing them with repeated detach() calls can end up detaching default R packages as well, and you have to restart R. Besides, the simplicity of a single rm(list=ls()) goes away.
One solution would be to attach var1, var2, var3 straight to the global environment.
Do you know how to do that?
attach(read.csv('foo.csv'), pos=1)
issues a warning (future error).
attach(read.csv('foo.csv'), pos=-1)
seems ineffective.
If you want to read the variables straight into the global environment, you can do this:
local({
  foo <- read.csv('foo.csv')
  for (n in names(foo)) assign(n, foo[[n]], globalenv())
})
The local() call prevents foo itself from also being added to the global environment (plain braces would not, since { } does not create a new scope in R). You can also make this into a function if you want.
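A sketch of the same idea wrapped in a reusable function (attach_csv_globally is a hypothetical name):
attach_csv_globally <- function(path) {
  dat <- read.csv(path)
  for (n in names(dat)) assign(n, dat[[n]], envir = globalenv())
  invisible(names(dat))  # return the variable names invisibly
}

attach_csv_globally('foo.csv')
ls()  # var1, var2, var3 now show up and are removed by rm(list = ls())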
Use the named variant of attach and detach:
attach(read.csv(text='var1,var2,var3\nvalue1,value2,value3'),
name = 'some_name')
and
detach('some_name')
This will prevent mistakes. You’d obviously wrap these two into functions and generate the names automatically in an appropriate manner (the easiest being via a monotonically increasing counter).
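A hedged sketch of such wrappers, with the counter kept in a local environment (all names are illustrative):
next_attach_name <- local({
  i <- 0
  function() {
    i <<- i + 1  # monotonically increasing counter
    paste0("attached_data_", i)
  }
})

attach_named <- function(data) {
  nm <- next_attach_name()
  attach(data, name = nm)
  nm  # return the name so the caller can detach it later
}

detach_named <- function(nm) detach(nm, character.only = TRUE)

nm <- attach_named(read.csv(text = 'var1,var2,var3\nvalue1,value2,value3'))
detach_named(nm)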
What about a 'safer' attach?
attach <- function(x) {
  for (n in names(x)) assign(n, x[[n]], globalenv())
  names(x)
}
'Safer' means you can see the attached variables with ls() and, most of all, remove them with a single rm(list=ls()).
Inspired by mrip
I have a file with an R program. I load it interactively in R and then call the main function. During execution I fill up some global lists, but when I run the main function again I want those lists to be empty. How can I empty a filled-up list? I tried doing
list <- NULL
after execution but it didn't work.
Since you are setting them globally, you probably need list <<- NULL, because the <<- operator assigns global variables.
Edit, per @Dason's comment:
The <<- operator assigns to the first enclosing environment in which the variable is found, so in some circumstances it can fail to change the variable in the environment you intend. For that reason, it is better to use
assign("list", NULL, envir = .GlobalEnv)