Error in parallel process using doSNOW - r

Error in { : task 1 failed - "invalid connection"
Why do I get this error every time I try to use all 4 cores for a parallel process?
Here is the example code:
NumberOfCluster <- 4
cl <- makeCluster(NumberOfCluster)
registerDoSNOW(cl)
fl <- file(file.choose(), "r")  # file.choose() locates a .tsv file of size 8 GB (RAM is 4 GB)
foreach(i = 1:3) %dopar% {
  View(name_fil <- read.delim(fl, nrows = 1000000, header = TRUE))
}

You're getting an error because file objects can't be exported to the workers. Instead, you could export the name of the file and open that file on each of the workers:
fname <- file.choose()
foreach(i = 1:3) %dopar% {
  fl <- file(fname, "r")
  View(name_fil <- read.delim(fl, nrows = 1000000, header = TRUE))
}
You may run into problems with the View function as well (workers have no GUI), but this should solve the "invalid connection" error.
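Note that each worker opens its own connection, so every task would read the same first million rows. A rough sketch (my addition, not part of the original answer; it assumes the doSNOW cluster from above is still registered) that gives each task a distinct slice of the file via skip:
fname <- file.choose()
chunk <- 1000000

res <- foreach(i = 1:3) %dopar% {
  # tasks after the first skip the header line plus the rows already
  # covered by earlier tasks; they also lose the column names
  read.delim(fname, header = (i == 1),
             skip = (i - 1) * chunk + (i > 1), nrows = chunk)
}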

Related

doSNOW and Foreach loop (R) on cluster?

I am using a cluster to run a foreach loop in parallel, using doSNOW. The loop works on my desktop, but I receive this error when running on the cluster:
Execution halted
Error in unserialize(node$con) : error reading from connection
Calls: local ... doTryCatch -> recvData -> recvData.SOCKnode -> unserialize
The loop is rather large, so I have just provided a very basic sample here (I do not believe the error is in the loop, as it works on the desktop).
library(sp)
library(raster)
library(fields)
library(tidyr)
library(dplyr)
library(sphereplot)
library(dismo)
library(doSNOW)
library(parallel)
cores <- (detectCores()/2)/2
print(cores)
cl <- makeCluster(cores, type = "SOCK", outfile = "")
registerDoSNOW(cl)
FossilClimCoV <- foreach(i = 0:5, .combine = "rbind",
                         .packages = c("dplyr", "dismo", "sp", "raster",
                                       "fields", "tidyr", "sphereplot",
                                       "doSNOW", "parallel")) %dopar% {
  print(i)
  FossilTemp <- Fossils %>% dplyr::filter(Age == i)
  if (nrow(FossilTemp) > 0) {
    ## BULK removed for ease
    return(FossilTemp1)
  }
}
I'm not sure how to fix this error. I don't understand why it will not work on the cluster, but will on my desktop.
EDIT 1
I have now resolved this large error by changing from a doSNOW backend to doParallel.
library(doParallel)
registerDoParallel(cores=3)
*foreach loop*
However, I now have a new error:
Calls: %dopar% -> <Anonymous>
Execution halted
If I change the error handling to "remove", the foreach loop always returns an empty vector.
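One way to see what is actually failing (a debugging sketch of my own, not part of the original post) is to switch the error handling to "pass", which returns the error objects instead of silently dropping failed tasks:
library(doParallel)
registerDoParallel(cores = 3)

res <- foreach(i = 0:5, .errorhandling = "pass") %dopar% {
  stop("placeholder for the real loop body")  # hypothetical failure
}
res[[1]]  # a condition object describing why the task failed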

How to fix C function R_nc4_get_vara_double returned error in ncdf4 parallel processing in R

I want to download .nc data through OPeNDAP from remote storage. I use a parallel backend with a foreach %dopar% loop as follows:
# INPUTS
inputs <- commandArgs(trailingOnly = TRUE)
interimpath <- as.character(inputs[1])
gcm <- as.character(inputs[2])
period <- as.character(inputs[3])
var <- as.character(inputs[4])
source <- 'MACAV2'
cat('\n\n EXTRACTING DATA FOR', var, gcm, period, '\n\n')

# CHANGING LIBRARY PATHS
.libPaths("/storage/home/htn5098/local_lib/R40")  # local library for packages
setwd('/storage/work/h/htn5098/DataAnalysis')
source('./src/Rcodes/CWD_function_package.R')  # load the helper-function R script

# CALLING PACKAGES
library(foreach)
library(doParallel)
library(parallel)
library(filematrix)

# REGISTERING CORES FOR PARALLEL PROCESSING
no_cores <- detectCores()
cl <- makeCluster(no_cores)
registerDoParallel(cl)
invisible(clusterEvalQ(cl, .libPaths("/storage/home/htn5098/local_lib/R40")))  # really have to import library paths into the workers
invisible(clusterEvalQ(cl, library(ncdf4)))

# EXTRACTING DATA FROM THE .NC FILES TO MATRIX FORM
url <- readLines('./data/external/MACAV2_OPENDAP_allvar_allgcm_allperiod.txt')
links <- grep(x = url, pattern = paste0('.*', var, '.*', gcm, '_.*', period), value = TRUE)
start <- c(659, 93, 1)   # lon, lat, time
count <- c(527, 307, -1)
spfile <- read.csv('./data/external/SERC_MACAV2_Elev.csv', header = TRUE)
grids <- sort(unique(spfile$Grid))
clusterExport(cl, list('ncarray2matrix', 'start', 'count', 'grids'))  # export objects to the workers
cat('\nChecking when downloading all grids\n')
# k <- foreach(x = links, .packages = c('ncdf4')) %dopar% {
#   nc <- nc_open(x)
#   nc.var <- ncvar_get(nc, varid = names(nc$var), start = start, count = count)
#   return(nc.var)
#   nc_close(nc)
# }
k <- foreach(x = links, .packages = c('ncdf4'), .errorhandling = 'pass') %dopar% {
  nc <- nc_open(x)
  print(nc)
  nc.var <- ncvar_get(nc, varid = names(nc$var),
                      start = c(659, 93, 1), count = c(527, 307, -1))
  nc_close(nc)
  return(dim(nc.var))
  Sys.sleep(10)
}
# k <- parSapply(cl, links, function(x) {
#   nc <- nc_open(x)
#   nc.var <- ncvar_get(nc, varid = names(nc$var), start = start, count = count)
#   nc_close(nc)
#   return(nc.var)
# })
print(k)
However, I keep getting this error:
<simpleError in ncvar_get_inner(ncid2use, varid2use, nc$var[[li]]$missval, addOffset, scaleFact, start = start, count = count, verbose = verbose, signedbyte = signedbyte, collapse_degen = collapse_degen): C function R_nc4_get_vara_double returned error>
What could be the reason for this problem? Can you recommend a solution for this that is time-efficient (I have to repeat this for about 20 files)?
Thank you.
I had the same error in my code. The problem was not the code itself but one of the files that I wanted to read: something was wrong with it, so R couldn't open it. I identified the file, downloaded it again, and the same code worked perfectly.
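In case it helps, a minimal sketch (my addition, assuming links is the vector of OPeNDAP URLs from the question) for locating an unreadable file:
library(ncdf4)

bad <- Filter(function(x) {
  tryCatch({ nc <- nc_open(x); nc_close(nc); FALSE },
           error = function(e) TRUE)
}, links)
print(bad)  # re-download these files, then rerun the parallel loop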
I also encountered the same error. For me, restarting the R session did the trick.

fread file which has abnormal value and R session aborted

I am trying to fread 80 CSV files of 350 to 400 MB each (not all at the same time). I have used tryCatch for exception handling, i.e. if any one file has abnormal values, the loop should still proceed, but either it does not execute the rest of the loop or the R session aborts and restarts.
The code below throws no error but does not execute completely.
Today <- Sys.Date()
for (k in 1:length(Dir)) {
  for (i in 1:length(server_name)) {
    setwd(Dir[[k]])
    myFiles <- list.files(pattern = server_name[i])
    Data <- data.table()
    Data <- tryCatch(
      fread(myFiles, sep = ",", header = TRUE, showProgress = TRUE,
            verbose = TRUE, fill = TRUE),
      error = function(err) {
        errMess <- paste0("Not available -", myFiles)
        write(errMess, "error_log.txt")
      })
    if (nrow(Data) != 0) {
      ## list of actions to be executed
      setwd("C:/D Drive data/Enrichment/RDS File1")
      saveRDS(Data, file = paste0(Today, "_", server_name[i], ".RDS"))
    }
  }
  Today <- Today - 1
}
If I don't handle the exceptions, it gives the error "R session aborted" and restarts.
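One likely culprit (my own observation, not from an answer): on error, the tryCatch handler above returns the value of write(), which is NULL, so the later nrow(Data) check fails. A minimal sketch that returns an empty data.table instead, keeping the loop alive:
library(data.table)

Data <- tryCatch(
  fread(myFiles, sep = ",", header = TRUE, fill = TRUE),
  error = function(err) {
    write(paste0("Not available - ", myFiles), "error_log.txt", append = TRUE)
    data.table()  # empty table, so nrow(Data) == 0 skips the save step
  }
)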

Error in serialize(data, node$con) : error writing to connection

I'm currently trying to run some code that implements parallel processing, but I'm running into this error:
Error: cannot allocate vector of size 2.1 Gb
Execution halted
Error in serialize(data, node$con) : error writing to connection
Calls: %dopar% ... postNode -> sendData -> sendData.SOCKnode -> serialize
Execution halted
Warning message:
system call failed: Cannot allocate memory
Error in unserialize(node$con) : error reading from connection
Calls: <Anonymous> ... doTryCatch -> recvData -> recvData.SOCKnode -> unserialize
Execution halted
I can't seem to figure out why there's a memory problem. If I take the code out of the foreach loop or change the foreach to a for loop, it works perfectly fine, so I don't think it has to do with the contents of the code itself, but rather something about the parallelization. Also, it seems to throw the error pretty soon after the code starts executing. Any ideas why this might be happening? Here's a look at my code:
list_storer <- list()
list_storer <- foreach(bt = 2:bootreps, .combine = list, .multicombine = TRUE) %dopar% {
  ur <- sample.int(nrow(dailydatyr), nrow(dailydatyr), replace = TRUE)
  ddyr_boot <- dailydatyr[ur, ]
  weightvar <- ddyr_boot[, c('ymd1_IssueD', 'MatD_ymd2')]
  weightvar <- abs(weightvar)
  x <- DM[ur, ]
  y <- log(ddyr_boot$dirtyprice2 / ddyr_boot$dirtyprice1)
  weightings <- rep(1, nrow(ddyr_boot))
  weightings <- weightings / (ddyr_boot$datenum2 - ddyr_boot$datenum1)
  treg <- repeatsales(y, x, maxdailyreturn, weightings, weightvar)
  zbtcol <- 0
  cnst <- NULL
  if (is.null(dums) == FALSE) {
    zbtcol <- length(treg) - ncol(x)
    cnst <- paste("tbs(", dums, ")_", (middleyr), sep = "")
    if (is.null(interactVar) == FALSE) {
      ninteract <- (length(treg) - ncol(x) - length(dums)) / length(dums)
      interact <- unlist(lapply(cnst, function(xla) paste(xla, "*c", c(1:ninteract), sep = "")))
      cnst <- c(cnst, interact)
    }
  }
  tregtotal <- tregtotal + (is.na(treg) == FALSE)
  treg[is.na(treg) == TRUE] <- 0
  list_storer[[length(list_storer) + 1]] <- treg
}
stopImplicitCluster(cl)
Parallelisation as done by foreach is a space vs. time trade-off: we get faster execution at the expense of higher memory usage. The reason for the higher memory usage is that several R processes are started, and each of them needs its own memory to hold the data necessary for the calculation. Currently foreach is using an implicit PSOCK cluster. One way to solve this is to make the cluster creation explicit, using a lower number of processes. How low depends on the amount of memory you have and on the memory requirements of each job:
n <- parallel::detectCores()/2 # experiment!
cl <- parallel::makeCluster(n)
doParallel::registerDoParallel(cl)
<foreach>
parallel::stopCluster(cl)
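For illustration, a minimal self-contained version of this pattern (the loop body is a placeholder for the real work) might look like:
library(doParallel)
library(foreach)

n <- max(1, parallel::detectCores() %/% 2)  # experiment with this number
cl <- parallel::makeCluster(n)
doParallel::registerDoParallel(cl)

res <- foreach(i = 1:10, .combine = c) %dopar% {
  sum(rnorm(1e6))  # stand-in for the real per-task computation
}

parallel::stopCluster(cl)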

How to implement parallel jags on Windows with foreach?

I would like to run JAGS models in parallel on my Windows computer with 4 cores, but I have not been able to figure out why my model will not run. I have searched the web extensively, including these posts:
http://andrewgelman.com/2011/07/23/parallel-jags-rngs/
http://users.soe.ucsc.edu/~draper/eBay-Google-2013-parallel-rjags-example.txt
When I run a simple example (see code below) with %do%, the model runs fine (serially of course). When I use %dopar%, I receive the error:
Error in { : task 1 failed - "Symbol table is empty"
library(rjags)
library(coda)
library(foreach)
library(doParallel)
library(random)

load.module("lecuyer")

### Data generation
y <- rnorm(100)
n <- length(y)
win.data <- list(y = y, n = n)

# Define model
sink("model.txt")
cat("
model {
  # Priors
  mu ~ dnorm(0, 0.001)
  tau <- 1 / (sigma * sigma)
  sigma ~ dunif(0, 10)
  # Likelihood
  for (i in 1:n) {
    y[i] ~ dnorm(mu, tau)
  }
}
", fill = TRUE)
sink()

inits <- function() {
  list(mu = rnorm(1), sigma = runif(1, 0, 10),
       .RNG.name = "lecuyer::RngStream",
       .RNG.seed = as.numeric(randomNumbers(n = 1, min = 1, max = 1e+06, col = 1)))
}
params <- c('mu', 'sigma')

cl <- makePSOCKcluster(3)
clusterSetRNGStream(cl)
registerDoParallel(cl)
model.wd <- paste(getwd(), '/model.txt', sep = '')  # I wondered if the cores were having trouble finding the model.
m <- foreach(i = 1:3, .packages = c('rjags', 'random', 'coda'), .multicombine = TRUE) %dopar% {
  load.module("lecuyer")
  model.jags <- jags.model(model.wd, win.data, inits = inits, n.chains = 1,
                           n.adapt = 1000, quiet = TRUE)
  result <- coda.samples(model.jags, params, 1000, thin = 5)
  return(result)
}
stopCluster(cl)
# Error in { : task 1 failed - "Symbol table is empty"
sessionInfo()
# R version 3.0.1 (2013-05-16)
# Platform: x86_64-w64-mingw32/x64 (64-bit)
#
# locale:
# [1] LC_COLLATE=English_Canada.1252 LC_CTYPE=English_Canada.1252 LC_MONETARY=English_Canada.1252
# [4] LC_NUMERIC=C LC_TIME=English_Canada.1252
#
# attached base packages:
# [1] parallel stats graphics grDevices utils datasets methods base
#
# other attached packages:
# [1] random_0.2.1 doParallel_1.0.3 iterators_1.0.6 foreach_1.4.1 rjags_3-10 coda_0.16-1
# [7] lattice_0.20-21
#
# loaded via a namespace (and not attached):
# [1] codetools_0.2-8 compiler_3.0.1 grid_3.0.1 tools_3.0.1
More Details:
The problem occurs on a Windows 7 computer with NO admin privileges, but not on a computer WITH admin privileges. The problem occurs with Rgui and Rterm and with the new rjags package 3-11. The error message occurs within the function jags.model.
The problem appears to stem from a mismatch in writing and reading files to a temporary directory. When I start R, it automatically creates a temporary folder. When I close R, this folder is automatically deleted, unless it contains files.
For example, when I start R it creates this folder:
C:\Users\jesse whittington\AppData\Local\Temp\RtmpoBe1gw.
When I run an rjags model with
m <- jags.model(file='model.txt', data=win.data, inits=inits, n.chains=3, n.adapt=1000, quiet=FALSE)
No files are written to this temporary directory.
When I run 3 chains serially with foreach and %do%, 3 temporary files are written to this folder. These files are 1 KB in size, and when I open them in a text editor they appear blank.
wd <- getwd()
cl <- makePSOCKcluster(3, outfile = paste(wd, '/Out_messages.txt', sep = ''))  # 3 chains
clusterSetRNGStream(cl)
registerDoParallel(cl)
m <- foreach(i = 1:3, .packages = c('rjags', 'random', 'coda'), .multicombine = TRUE) %do% {
  load.module("lecuyer")
  result <- jags.model(file = 'model.txt', data = win.data, inits = inits,
                       n.chains = 1, n.adapt = 1000, quiet = FALSE)
  return(result)
}
stopCluster(cl)
When I run 3 chains in parallel with foreach and %dopar%, 3 temporary files are written to the folder ..Temp\RtmpoBe1gw. The error messages in the outfile suggest that the function is looking for DIFFERENT files in DIFFERENT temporary directories. When I include a line to create a tempfile directory and name, I see that 3 new temporary folders are created (they are later deleted with stopCluster). jags.model looks in these 3 folders for the temporary files and fails because there is nothing in them. Thus, I suspect the temp files are written to one temporary directory (associated with the parent R session), and the call then fails when trying to open different temp files in the 3 temporary directories created within foreach.
wd <- getwd()
cl <- makePSOCKcluster(3, outfile = paste(wd, '/Out_messages.txt', sep = ''))  # 3 chains
clusterSetRNGStream(cl)
registerDoParallel(cl)
m <- foreach(i = 1:3, .packages = c('rjags', 'random', 'coda'), .multicombine = TRUE) %dopar% {
  load.module("lecuyer")
  tmp <- tempfile()
  print(tmp)
  result <- jags.model(file = 'model.txt', data = win.data, inits = inits,
                       n.chains = 1, n.adapt = 1000, quiet = FALSE)
  return(result)
}
stopCluster(cl)
From Out_messages.txt
starting worker pid=4396 on localhost:11109 at 08:34:06.430
starting worker pid=6548 on localhost:11109 at 08:34:06.879
starting worker pid=6212 on localhost:11109 at 08:34:07.418
Loading required package: coda
Loading required package: lattice
Loading required package: coda
Loading required package: lattice
Loading required package: coda
Loading required package: lattice
Linked to JAGS 3.3.0
Loaded modules: basemod,bugs
Linked to JAGS 3.3.0
Loaded modules: basemod,bugs
Linked to JAGS 3.3.0
Loaded modules: basemod,bugs
module lecuyer loaded
module lecuyer loaded
module lecuyer loaded
[1] "C:\\Users\\JESSEW~1\\AppData\\Local\\Temp\\RtmpQbPAVC\\file112c8077a0" # Note this is from: tmp <- tempfile()
[1] "C:\\Users\\JESSEW~1\\AppData\\Local\\Temp\\RtmpMPMpcY\\file199489564c6"
[1] "C:\\Users\\JESSEW~1\\AppData\\Local\\Temp\\Rtmpk9vMR5\\file18445f6b2fd4"
Compiling model graph
Compiling model graph
Compiling model graph
Warning messages:
1: In jags.model(file = "model.txt", data = win.data, inits = inits, :
Unused variable "y" in data
2: In jags.model(file = "model.txt", data = win.data, inits = inits, :
Unused variable "n" in data
3: In jags.model(file = "model.txt", data = win.data, inits = inits, :
Failed to open file C:\Users\JESSEW~1\AppData\Local\Temp\RtmpQbPAVC\file112c394b4eef
Nothing to compile
4: In jags.model(file = "model.txt", data = win.data, inits = inits, :
Unused initial value for "mu" in chain 1
5: In jags.model(file = "model.txt", data = win.data, inits = inits, :
Unused initial value for "sigma" in chain 1
6: In jags.model(file = "model.txt", data = win.data, inits = inits, :
Can't initialize. No nodes in graph (Have you compiled the model?)
The folder RtmpQbPAVC is created but the file file112c394b4eef does not exist.
Steve brought this to my attention, but your second example shows that it is not a problem with rjags. I am unable to reproduce the bug in either example using the same setup (Windows 7, R 3.0.1, JAGS 3.3.0, ordinary user without admin access).
Since the errors are caused by writing and reading the model file, I suggest that you bypass that issue by using the "textConnection" function. This can be used to create a file-like object without creating an actual file, thus avoiding the need for temporary files. I modified your example to demonstrate this:
library(rjags)
library(doParallel)
library(random)
load.module("lecuyer")
y <- rnorm(100)
n <- length(y)
win.data <- list(y=y, n=n)
model <- "
model {
  # Priors
  mu ~ dnorm(0, 0.001)
  tau <- 1 / (sigma * sigma)
  sigma ~ dunif(0, 10)
  # Likelihood
  for (i in 1:n) {
    y[i] ~ dnorm(mu, tau)
  }
}"

inits <- function() {
  list(mu = rnorm(1), sigma = runif(1, 0, 10),
       .RNG.name = "lecuyer::RngStream",
       .RNG.seed = as.numeric(randomNumbers(n = 1, min = 1, max = 1e+06, col = 1)))
}
params <- c('mu', 'sigma')

cl <- makePSOCKcluster(3)
clusterSetRNGStream(cl)
registerDoParallel(cl)
m <- foreach(i = 1:3, .packages = c('rjags', 'random'),
             .combine = 'c', .final = mcmc.list) %dopar% {
  load.module("lecuyer")
  model.jags <- jags.model(textConnection(model), win.data, inits = inits,
                           n.chains = 1, n.adapt = 1000, quiet = TRUE)
  coda.samples(model.jags, params, 1000, thin = 5)
}
I also changed the result handling so that the value returned by the foreach loop is an "mcmc.list" object, which is what the "coda.samples" function returns.
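As a usage note (my addition, not part of the original answer), the resulting mcmc.list can then be inspected with coda's standard tools:
summary(m)       # posterior summaries for mu and sigma
plot(m)          # trace and density plots
gelman.diag(m)   # convergence diagnostic across the three chains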
I have identified the source of the problem.
I can write and read files to and from a temporary directory when using R normally.
When in parallel, I can write files to the temporary directories, but I do NOT have permission to read files.
The problem occurs with both text files (written and read using writeLines and readLines) and CSV files.
I have since found that if I receive the message "Error in { : task 1 failed - cannot open the connection", I can rectify the problem by deleting all temporary files in TEMP. For some locked files, I have to shut down and restart the computer before I am able to delete the necessary files. Even so, within the same R session I might receive the error message and then be able to successfully run the program on my next try. The problem likely stems from our government anti-virus software and/or the structure of our remote network access.
Here is an example that writes and reads text files for simplicity.
library(foreach)
library(doParallel)
wd <- getwd()
data <- data.frame(x=1:10, y=1:10)
This works fine.
modfile <- tempfile()
print(modfile)
# "C:\\Users\\JESSEW~1\\AppData\\Local\\Temp\\RtmpsvYfFk\\filef38a272022"
write.csv(data, modfile, row.names = FALSE)
m <- read.csv(modfile)
This does not work
cl <- makePSOCKcluster(3, outfile = paste(wd, '/Out_messages.txt', sep = ''))  # 3 chains
clusterSetRNGStream(cl)
registerDoParallel(cl)
m <- foreach(i = 1:3) %dopar% {
  modfile <- tempfile()
  write.csv(data, modfile, row.names = FALSE)
  x <- read.csv(modfile)
  return(x)
}
# Error in { : task 1 failed - "cannot open the connection"
stopCluster(cl)
Here is the output from Out_messages.txt. Note the "Permission denied" messages.
starting worker pid=6852 on localhost:11611 at 22:09:19.488
starting worker pid=6984 on localhost:11611 at 22:09:19.926
starting worker pid=3384 on localhost:11611 at 22:09:20.441
Warning message:
Warning message:
In file(con, "r") :
cannot open file 'C:\Users\JESSEW~1\AppData\Local\Temp\Rtmp6dEZLP\file1ac44a506032': Permission denied
In file(con, "r") :
cannot open file 'C:\Users\JESSEW~1\AppData\Local\Temp\RtmpuydRvR\file1b48185f2a2d': Permission denied
Warning message:
In file(con, "r") :
cannot open file 'C:\Users\JESSEW~1\AppData\Local\Temp\RtmpAbOIng\filed382ef37d51': Permission denied
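If the default %TEMP% location is the problem, one possible workaround (an assumption on my part, not something tested in the original post) is to point R and its workers at a different temporary directory before the cluster is created. R reads the TMPDIR/TMP/TEMP environment variables at startup, and PSOCK workers inherit the master's environment:
dir.create("C:/Rtemp", showWarnings = FALSE)
Sys.setenv(TMPDIR = "C:/Rtemp", TMP = "C:/Rtemp", TEMP = "C:/Rtemp")

cl <- parallel::makePSOCKcluster(3)
parallel::clusterEvalQ(cl, tempdir())  # should now report paths under C:/Rtemp
parallel::stopCluster(cl)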
