I have been trying to use an R function called ipsi, which takes the arguments (a, y, id, time, x.trt, x.out, delta.seq, nsplits). Originally the components of these arguments were in one dataframe (except for delta.seq and nsplits, which are coded later), but my understanding is that I needed to put them in separate lists, and in the case of x.trt and x.out, matrices. The function is easy to run on a single set of arguments, but since I multiply imputed the dataframe 30 times before splitting it up into the elements taken as ipsi arguments, I now want to iterate over the set of elements 30 times, as if there were 30 dataframes. Additionally, I want to parallelize to make the most of my computing power.
I have just expanded the npcausal example:
n <- 500
T <- 4
time <- rep(1:T, n)
time <- rep(list(time), 30) # 30 copies, one per imputed dataset
id <- rep(1:n, rep(T, n))
id <- rep(list(id), 30)
x.trt <- matrix(rnorm(n * T * 5), nrow = n * T)
x.trt <- rep(list(x.trt), 30)
x.out <- matrix(rnorm(n * T * 5), nrow = n * T)
x.out <- rep(list(x.out), 30)
a <- rbinom(n * T, 1, .5)
a <- rep(list(a), 30)
y <- rnorm(n, mean = 1)
y <- rep(list(y), 30)
d.seq <- seq(0.1, 5, length.out = 10)
d.seq <- rep(list(d.seq), 30)
set.seed(500, kind = "L'Ecuyer-CMRG")
numcores <- future::availableCores()
cl <- parallel::makeCluster(numcores)
parallel::clusterEvalQ(cl, library(dplyr))
parallel::clusterEvalQ(cl, library(npcausal))
parallel::clusterExport(cl, "d.seq", envir = environment())
parallel::clusterEvalQ(cl, d.seq <- d.seq)
new_element <- parallel::parLapply(cl = cl, for (i in 1:30) {
  npcausal::ipsi(a = a[[i]],
                 y = y[[i]],
                 id = id[[i]],
                 time = time[[i]],
                 x.out = x.out[[i]],
                 x.trt = x.trt[[i]],
                 delta.seq = d.seq[[i]],
                 nsplits = 10)
})
This actually runs, but at the end of the process it gives me an error saying that the argument FUN was missing. I knew that already, but I have no FUN to call besides ipsi. Thanks for any help you can provide.
My suggestion is to first figure out how to do this with a regular base-R *apply function, without worrying about parallelization. I suspect you can use mapply() for this, so something like (not confirmed):
res <- mapply(
  a, y, id, time, x.out, x.trt, d.seq,
  FUN = function(a_i, y_i, id_i, time_i, x.out_i, x.trt_i, d.seq_i) {
    npcausal::ipsi(a = a_i, y = y_i, id = id_i, time = time_i,
                   x.out = x.out_i, x.trt = x.trt_i, delta.seq = d.seq_i,
                   nsplits = 10)
  },
  SIMPLIFY = FALSE # keep the 30 ipsi results as a list
)
When you have figured that part out, you can start thinking about parallelization.
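For the parallel step, parallel::clusterMap() is the cluster analogue of mapply(). A minimal sketch, untested and assuming the mapply() version above works (the worker count is arbitrary):
library(parallel)
cl <- makeCluster(detectCores() - 1)
clusterEvalQ(cl, library(npcausal))
res <- clusterMap(
  cl,
  function(a_i, y_i, id_i, time_i, x.out_i, x.trt_i, d.seq_i) {
    npcausal::ipsi(a = a_i, y = y_i, id = id_i, time = time_i,
                   x.out = x.out_i, x.trt = x.trt_i, delta.seq = d.seq_i,
                   nsplits = 10)
  },
  a, y, id, time, x.out, x.trt, d.seq # iterates elementwise over the 30-element lists
)
stopCluster(cl)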
(Disclaimer: I'm the author.) If you get an mapply() solution to work, then the simplest would be to replace it as-is with future_mapply() from the future.apply package. That will parallelize on your local machine if you set plan(multisession).
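A minimal sketch of that swap, untested and assuming the mapply() call above is correct:
library(future.apply)
plan(multisession) # parallelize on the local machine

res <- future_mapply(
  FUN = function(a_i, y_i, id_i, time_i, x.out_i, x.trt_i, d.seq_i) {
    npcausal::ipsi(a = a_i, y = y_i, id = id_i, time = time_i,
                   x.out = x.out_i, x.trt = x.trt_i, delta.seq = d.seq_i,
                   nsplits = 10)
  },
  a, y, id, time, x.out, x.trt, d.seq,
  SIMPLIFY = FALSE,
  future.seed = TRUE # statistically sound parallel random numbers
)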
I have been struggling with this for a long time. I have a logistic growth function where the growth parameter r is a matrix. The model is constructed so that it outputs two N values, N1 and N2. I would like to be able to change the r parameter over time. When time < 50, I would like r = r1, where
r1 = matrix(c(2, 3), nrow = 1, ncol = 2)
When time >= 50, I would like r = r2, where
r2 = matrix(c(1, 2), nrow = 1, ncol = 2)
Here is my function. Any help is highly appreciated.
rm(list = ls())
library(deSolve)
model <- function(time, y, params) {
  with(as.list(c(y, params)), {
    N <- y[paste("N", 1:2, sep = "")]
    dN <- r * N * (1 - N / K)
    return(list(c(dN)))
  })
}

r <- matrix(c(4, 5), nrow = 1, ncol = 2)
K <- 100
params <- list(r = r, K = K) # name the elements so with() can find them
y<- c(N1=0.1, N2=0.2)
times <- seq(0,100,1)
out <- ode(y, times, model, params)
plot(out)
Ideally, I would like something like this, but it does not work:
model <- function(time, y, params) {
  with(as.list(c(y, params)), {
    N <- y[paste("N", 1:2, sep = "")]
    r <- ifelse(times < 10, matrix(c(1, 3), nrow = 1, ncol = 2),
           ifelse(times > 10, matrix(c(1, 4), nrow = 1, ncol = 2),
                  matrix(c(1, 2), nrow = 1, ncol = 2)))
    print(r)
    dN <- r * N * (1 - N / K)
    return(list(c(dN)))
  })
}
Thank you for your time.
Here is a generic approach that uses an extended version of the approx function. Note also some further simplifications of the model function and the additional plot of the parameter values.
Edit: changed according to the suggestion of Lewis Carter to make the parameter change at t = 3, so that the effect can be seen.
library(simecol) # contains approxTime, a vector version of approx

model <- function(time, N, params) {
  r <- approxTime(params$signal, time, rule = 2, f = 0, method = "constant")[-1] # drop the time column
  K <- params$K
  dN <- r * N * (1 - N / K)
  return(list(c(dN), r))
}

signal <- matrix(
  # time, r[1], r[2]
  c(  0, 2, 3,
      3, 1, 2,
    100, 1, 2), ncol = 3, byrow = TRUE
)

## test of the interpolation
approxTime(signal, c(1, 2.9, 3, 100), rule = 2, f = 0, method = "constant")

params <- list(signal = signal, K = 100)
y <- c(N1 = 0.1, N2 = 0.2)
times <- seq(0, 10, 0.1)

out <- ode(y, times, model, params)
plot(out)
For a small number of state variables, as in the example, separate signals created with approxfun from package stats will look less generic but may be slightly faster.
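A minimal sketch of that variant (my illustration; the breakpoints mirror the signal matrix above):
## one stepwise interpolation function per state variable
r1fun <- approxfun(c(0, 3, 100), c(2, 1, 1), method = "constant", rule = 2)
r2fun <- approxfun(c(0, 3, 100), c(3, 2, 2), method = "constant", rule = 2)

model2 <- function(time, N, params) {
  r <- c(r1fun(time), r2fun(time))
  dN <- r * N * (1 - N / params$K)
  list(dN, r)
}

out2 <- ode(y, times, model2, params)
plot(out2)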
As a further improvement, one may consider replacing the "hard" transitions with smooth ones. These can then be formulated directly as a function of time, without the need for approx, approxfun or approxTime.
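For example, a minimal sketch of such a smooth transition using a logistic blend between the two r vectors (the centre t0 = 3 and the steepness k = 10 are illustrative assumptions):
## smoothly blend r between c(2, 3) and c(1, 2) around t0
r_smooth <- function(t, r1 = c(2, 3), r2 = c(1, 2), t0 = 3, k = 10) {
  w <- 1 / (1 + exp(-k * (t - t0))) # weight rises from 0 to 1 near t0
  (1 - w) * r1 + w * r2
}

model_smooth <- function(time, N, params) {
  r <- r_smooth(time)
  dN <- r * N * (1 - N / params$K)
  list(dN, r)
}

out3 <- ode(y, times, model_smooth, params)
plot(out3)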
Edit 2:
Package simecol imports deSolve, and we need only a small function from it, so instead of loading simecol it is also possible to include the approxTime function directly in the code. The conversion from data frame to matrix improves performance, but a matrix is preferable anyway in such cases.
approxTime <- function(x, xout, ...) {
  if (is.data.frame(x)) {x <- as.matrix(x); wasdf <- TRUE} else wasdf <- FALSE
  if (!is.matrix(x)) stop("x must be a matrix or data frame")
  m <- ncol(x)
  y <- matrix(0, nrow = length(xout), ncol = m)
  y[, 1] <- xout
  for (i in 2:m) {
    y[, i] <- as.vector(approx(x[, 1], x[, i], xout, ...)$y)
  }
  if (wasdf) y <- as.data.frame(y)
  names(y) <- dimnames(x)[[2]]
  y
}
If you want to pass a matrix parameter, you should pass a list of parameters, and you can modify it inside the model function when your time limit is exceeded. (In the example below you don't even have to pass the r matrix to the model function.)
library(deSolve)

model <- function(time, y, params) {
  with(as.list(c(y, params)), {
    if (time < 3) {
      r <- matrix(c(2, 3), nrow = 1, ncol = 2)
    } else {
      r <- matrix(c(1, 3), nrow = 1, ncol = 2)
    }
    N <- y[paste("N", 1:2, sep = "")]
    dN <- r * N * (1 - N / K)
    return(list(c(dN)))
  })
}

y <- c(N1 = 0.1, N2 = 0.2)
params <- list(r = matrix(c(0, 0), nrow = 1, ncol = 2), K = 100)
times <- seq(0, 10, 0.1)

out <- ode(y, times, model, params)
plot(out)
You can see examples of this approach, for instance, with delay differential equations (?dede).
I'm building predator-prey models based on Lotka-Volterra derivatives in R using the package deSolve. I define the parameters, the initial state, the time steps and the model function, then solve everything using ode(), or dede() when using a time lag.
I noticed there is a big difference in output depending on how you refer to the parameters WITHIN the model function, and I really don't understand why. You can extract the parameters either via the function argument, parms['r'], or via the previously defined object I passed to that argument, parameters['r']. Both give the same result.
This is different for the initial state though: using the argument, y[1] or y['N'], gives a totally different result than using the object passed to the argument, init[1] or init['N'].
The same holds in the DDE: there is a difference between times - tau and time - tau, and between ylag <- y and ylag <- init.
Why is there a different result for argument vs. object for the initial state and the time, but not for the parameters? I need to understand this well in order to use the FME package at a later stage, so I hope someone can explain this behaviour.
My code:
library(deSolve)
## Parameters
parameters <- c(r = 0.25, K = 200, a = 0.01, c = 0.01, m = 1, tau = 7)
init <- c(N = 20, P = 2)
time <- seq(0, 100, by = 0.01)
## Ordinary DE
PreyPred <- function(times, y, parms){ # chose the same argument names as ode() uses
  N <- y['N'] # y[1] works as well
  P <- y['P']
  #N <- init['N'] # (or init[1]) gives a totally different result!
  #P <- init['P']
  r <- parms['r'] # growth rate prey; parameters['r'] gives the same result
  K <- parms['K'] # carrying capacity prey
  a <- parms['a'] # attack rate predator
  c <- parms['c'] # assimilation rate (?) predator
  m <- parms['m'] # mortality predator
  dN <- r * N * (1 - N/K) - a * N * P
  dP <- c * N * P - m * P
  return(list(c(dN, dP)))
}
oderesult <- ode(func = PreyPred, parms = parameters, y = init, times = time)
plot(oderesult, lwd = 2, mfrow = c(1,2))
## Delayed DE
PreyPredLag <- function(times, y, parms){
  N <- y['N']
  P <- y['P']
  #N <- init['N']
  #P <- init['P']
  r <- parms['r']     # growth rate prey
  K <- parms['K']     # carrying capacity prey
  a <- parms['a']     # attack rate predator
  c <- parms['c']     # assimilation rate (?) predator
  m <- parms['m']     # mortality predator
  tau <- parms['tau'] # time lag

  tlag <- times - tau
  #tlag <- time - tau # different result
  if (tlag < 0) {
    ylag <- y
    #ylag <- init
  } else {
    ylag <- lagvalue(tlag)
  }

  # dede
  dN <- r * N * (1 - N/K) - a * N * P
  dP <- c * ylag[1] * ylag[2] - m * P
  return(list(c(dN, dP), lag = ylag))
}
dederesult <- dede(func = PreyPredLag, parms = parameters, y = init, times = time)
plot(dederesult, lwd = 2, mfrow = c(2,2))
The observed behavior is correct. A short explanation:
'parms' is the local variable inside the model function, while 'parameters' is the global variable in the workspace. This is nothing special to deSolve; it is simply how R's scoping works. In most cases the preferred way is to use the local variable.
For the states this is indeed different: the outer object 'init' holds the initial values at the start of the simulation, while the local 'y' holds the current values at each time step. If you read 'init' inside the function, the states stay frozen at their initial values, so the derivatives never change.
The dede case is analogous: 'init' is the start value and 'y' the instantaneous value; likewise, 'time' is the global vector of all output time steps, while the argument 'times' is the current time step of the solver.
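A minimal sketch (my own illustration, not from your post) of why reading the global init instead of the argument y goes wrong: the state is frozen at its initial value, so the derivative never changes.
library(deSolve)
init <- c(N = 20)
ok     <- function(t, y, p) list(p['r'] * y['N'])    # uses the current state
frozen <- function(t, y, p) list(p['r'] * init['N']) # derivative fixed at r * 20
t_out <- seq(0, 5, 1)
ode(y = init, times = t_out, func = ok,     parms = c(r = 0.1)) # exponential growth
ode(y = init, times = t_out, func = frozen, parms = c(r = 0.1)) # linear growth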
I am trying to simulate a model with this differential equation for a concentration A:
dA/dt = (a - b) * exp(d * (s - t))
The equation has the parameters a, b, d, and s. I cannot figure out how to use R to solve differential equations that contain the time variable t directly. I tried the function radau from the package deSolve (see below), but I did not get the code to work. I also do not understand how to define the index variable, or whether this is solvable with radau at all. (All my other, simpler differential equations have run fine in the past with deSolve's ode function.)
I hope you can help me!
My try:
#Defining parameters
parameter <- c(a = 0.03, b = 0.02, d = 0.01, s = 179)

#Defining Function
Function1 <- function(t, y, parameter) {
  with(as.list(Y),
       list(c(dA = (a-b)*exp^(d*(s-t)))))
}

#Initial conditions
yini <- c(A = 1)

#Mass matrix
M <- diag(nrow = 1)
M[5,5] <- 0
M

#index/times/output
index <- c(1)
times <- seq(from = 0, to = 10, by = 0.01)
out <- radau(y = yini, func = Function1, parms = parameters, times = times,
             mass = M, nind = index)
plot(out, type = "l", lwd = 2)
I'm not sure what's up with M or index, since they don't appear in your model, but here's code that runs and produces results, based on yours.
library(deSolve)

# Defining parameters
parameter <- c(a = 0.03, b = 0.02, d = 0.01, s = 179)

# Defining the model function
model <- function(t, y, parameter) {
  with(as.list(parameter), {
    dA <- (a - b) * exp(d * (s - t))
    list(dA)
  })
}

# Initial conditions
yini <- 1

# Output times
times <- seq(from = 0, to = 10, by = 0.01)

# Solve the model
out <- ode(y = yini, func = model, parms = parameter, times = times)

# Plot results
plot(out, type = "l", lwd = 2)
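This equation also has a closed-form solution, A(t) = A(0) + (a - b)/d * exp(d*s) * (1 - exp(-d*t)), obtained by direct integration, so you can sanity-check the numerical result (my own addition):
# compare the numerical solution against the analytic one
with(as.list(parameter), {
  A_exact <- 1 + (a - b) / d * exp(d * s) * (1 - exp(-d * times))
  max(abs(out[, 2] - A_exact)) # should be near zero
})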