How do I run a function every second in R

I want to run a function that takes less than one second to execute, and I want it to start once every second in a loop. I do not want to simply wait one second between runs, as Sys.sleep(1) would do.
while (TRUE) {
  # my function that takes less than a second to run
  Sys.sleep(runif(1, min = 0, max = 0.8))
  # wait for the remaining time until the next execution...
  # something here
}
I could record starttime <- Sys.time() and compare against it on every iteration of the loop, something like this...
starttime <- Sys.time()
while (TRUE) {
  if (abs(as.numeric(Sys.time() - starttime) %% 1) < 0.001) {
    # my function that takes less than a second to run
    Sys.sleep(runif(1, min = 0, max = 0.8))
    print(paste("it ran", Sys.time()))
  }
}
But my function never seems to be executed.
I know Python has a package to do this sort of thing. Does R also have one that I don't know about? Thanks.

You can keep track of the time with system.time:
while (TRUE) {
  s <- system.time(Sys.sleep(runif(1, min = 0, max = 0.8)))
  Sys.sleep(1 - s[3])  # basically sleep for whatever is left of the second
}
You can also use proc.time directly (which system.time calls), which for some reason gave better results for me:
> system.time(
+   for (i in 1:10) {
+     p1 <- proc.time()
+     Sys.sleep(runif(1, min = 0, max = 0.8))
+     p2 <- proc.time() - p1
+     Sys.sleep(1 - p2[3])  # basically sleep for whatever is left of the second
+   })
   user  system elapsed
   0.00    0.00   10.02
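A variant of the same idea using Sys.time() directly: record the time before the call and sleep only for whatever is left of the second, clamping at zero so a run that overshoots one second does not produce a negative sleep. This is only a minimal sketch; the runif() sleep stands in for your own function.

while (TRUE) {
  t0 <- Sys.time()
  Sys.sleep(runif(1, min = 0, max = 0.8))              # stand-in for your function
  elapsed <- as.numeric(Sys.time() - t0, units = "secs")
  Sys.sleep(max(0, 1 - elapsed))                       # sleep only for the remainder of the second
}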

Here are some alternatives. These do not block; that is, you can still use the console to run other code while they are running.
1) tcltk Try after in the tcltk package:
library(tcltk)

run <- function() {
  .id <<- tcl("after", 1000, run)  # after 1000 ms, execute run() again
  cat(as.character(.id), "\n")     # replace with your code
}
run()
Running this on a fresh R session gives:
after#0
after#1
after#2
after#3
after#4
after#5
after#6
after#7
...etc...
To stop it, run tcl("after", "cancel", .id).
2) tcltk2 Another possibility is tclTaskSchedule in the tcltk2 package:
library(tcltk2)
test <- function() cat("Hello\n") # replace with your function
tclTaskSchedule(1000, test(), id = "test", redo = TRUE)
Stop it with:
tclTaskDelete("test")
Alternatively, redo= can specify the number of times it should run (see the sketch below).
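For example, to run the task five times, one second apart, and then stop on its own (a small sketch based on the redo= argument; the id is arbitrary):

library(tcltk2)
test <- function() cat("Hello\n")
tclTaskSchedule(1000, test(), id = "test5", redo = 5)  # runs test() 5 times, then stops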

Another non-blocking alternative worth mentioning is provided by library(later), by calling later() recursively:
print_time <- function(interval = 10) {
  timestamp()
  later::later(print_time, interval)
}
print_time()
The example is taken from here.
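A small addition, assuming a reasonably recent version of later: later() returns a function that cancels the pending callback, so you can keep a handle on it if you want to stop a scheduled run.

# minimal illustration of the cancel handle (assumes a recent later version)
cancel <- later::later(function() timestamp(), delay = 10)  # schedule one run in 10 seconds
cancel()                                                    # cancel it before it fires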

The shiny package has a function invalidateLater() which can be used to trigger functions at regular intervals. Have a look at http://shiny.rstudio.com/gallery/timer.html
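A minimal sketch of that approach (a hypothetical app, not the linked gallery example): invalidateLater() re-runs the reactive roughly every 1000 ms while the app is open.

library(shiny)

ui <- fluidPage(textOutput("time"))

server <- function(input, output, session) {
  output$time <- renderText({
    invalidateLater(1000, session)  # invalidate (and re-run) this reactive every second
    format(Sys.time())              # replace with your own function
  })
}

shinyApp(ui, server)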

Although it's very late, as an alternative we can use recursion. I don't know if it's the solution you are looking for, but it executes the function at a regular interval.
ssc <- function() {
  x <- rnorm(30, 20, 2)
  print(hist(x))
  Sys.sleep(4)
  ssc()
}
ssc()
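One caveat: each recursive call adds a stack frame, so a timer that runs for a long time will eventually hit R's nesting limit. A plain repeat loop gives the same behaviour without growing the call stack (same body, just non-recursive):

ssc_loop <- function() {
  repeat {
    x <- rnorm(30, 20, 2)
    print(hist(x))
    Sys.sleep(4)
  }
}
ssc_loop()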

Related

How can we build a timer for 5 different players using closures in R

There are 4 separate functions, for starting, stopping, getting the duration, and displaying the results in descending order, which need to be implemented using closures. I was trying to implement the first three functions:
StopWatch <- function() {
  list(strt <- function(Name = "name") Start <<- Sys.time(),
       stop <- function() End <<- Sys.time(),
       duration <- function() {
         t <<- Start - End
         print(t)
       })
}
w <- StopWatch()
w$strt("player1")
sleep_for_a_minute()
w$stop()
w$duration()
When I run w$strt("player1") it gives Error: attempt to apply non-function.
The list you create in your function isn't named. Check by running names(w). You should use = instead of <-
As @Tyler Smith already wrote, you need =, because you want to assign names to list elements here rather than define standalone functions. You are also attempting to overwrite the t() function (which happens to throw an error here), so use a different name.
StopWatch <- function() {
list(strt=function(Name = "name") Start <<- Sys.time(),
stop=function() End <<- Sys.time(),
duration=function() {
tm <<- Start - End
print(tm)
})
}
w <- StopWatch()
w$strt("player1")
Sys.sleep(1)
w$stop()
w$duration()
# Time difference of -2.27513 secs
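Since the original question asks about timing five different players, one way to extend the same closure idea is to key the start and end times by player name. This is only a sketch under that assumption; StopWatches(), results(), and the player names below are illustrative, not from the question.

StopWatches <- function() {
  Start <- list(); End <- list()
  list(
    strt     = function(name) Start[[name]] <<- Sys.time(),
    stop     = function(name) End[[name]]   <<- Sys.time(),
    duration = function(name) as.numeric(difftime(End[[name]], Start[[name]], units = "secs")),
    results  = function() {
      d <- sapply(names(End), function(n) as.numeric(difftime(End[[n]], Start[[n]], units = "secs")))
      sort(d, decreasing = TRUE)  # durations in descending order
    }
  )
}

w <- StopWatches()
w$strt("player1"); Sys.sleep(1); w$stop("player1")
w$strt("player2"); Sys.sleep(2); w$stop("player2")
w$results()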

How to use the withTimeout function to interrupt an expression if it takes too long

I would like to terminate some code if a computation takes too long, i.e. more than 2 seconds. I am trying to use the withTimeout function from R.utils. Following the example in the help, the code below works as expected and I get a timeout error:
library(R.utils)  # provides withTimeout()

foo <- function() {
  print("Tic")
  for (kk in 1:100) {
    print(kk)
    Sys.sleep(0.1)
  }
  print("Tac")
}
res <- withTimeout({ foo() }, timeout = 2)
I tried to replicate this logic with the following code, but it does not work: the computation runs to completion even though the timeout has passed (on my laptop, it takes roughly 10 seconds).
res <- withTimeout({rnorm(100000000)}, timeout = 2)
Does anyone know why?
The rnorm example is a known "issue", which you can find on the R.utils GitHub site as a non-supported case: the whole vector is generated in a single long-running compiled call, so R never reaches a point where the time limit is checked.
You can make this work by splitting the work into many small R-level calls:
foo1 <- function(n = 1000000) {
  ret <- rep(0, n)
  for (kk in 1:n) ret[kk] <- rnorm(1)
  ret
}

# The following will time out after 2s
tryCatch({
  res <- withTimeout({ foo1() }, timeout = 2)
}, TimeoutException = function(ex) cat("Timed out\n"))
# Timed out

# Confirm that res is empty
res
# NULL
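If you would rather not wrap the call in tryCatch, withTimeout() also takes an onTimeout argument (as far as I know it accepts "error", "warning", or "silent"), so a timed-out call can simply return NULL. A short sketch, reusing foo1() from above:

library(R.utils)
res <- withTimeout(foo1(), timeout = 2, onTimeout = "silent")
res
# NULL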

Parallelized `Find` loop in R

There are several packages in R to simplify running code in parallel, like foreach and future. Most of these have constructs which are like lapply or a for loop: they carry on until all the tasks have finished.
Is there a simple parallel version of Find? That is, I would like to run several tasks in parallel. I don't need all of them to finish, I just need to get the first one that finishes (maybe with a particular result). After that the other tasks can be killed, or left to finish on their own.
Conceptual code:
hunt_needle <- function (x, y) x %in% (y-1000):y
x <- sample.int(1000000, 1)
result <- parallel_find(seq(1000, 1000000, 1000), hunt_needle)
# should return the first value for which hunt_needle is true
You can use shared memory so that processes can communicate with one another.
For that, you can use package bigstatsr (disclaimer: I'm the author).
Choose a block size and do:
# devtools::install_github("privefl/bigstatsr")
library(bigstatsr)

# Data example
cond <- logical(1e6)
cond[sample(length(cond), size = 1)] <- TRUE

ind.block <- bigstatsr:::CutBySize(length(cond), block.size = 1000)

cl <- parallel::makeCluster(nb_cores())
doParallel::registerDoParallel(cl)

# This value (in an on-disk matrix) is shared by the processes
found_it <- FBM(1, 1, type = "integer", init = 0L)

library(foreach)
res <- foreach(ic = sample(rows_along(ind.block)), .combine = 'c') %dopar% {
  if (found_it[1]) return(NULL)
  ind <- bigstatsr:::seq2(ind.block[ic, ])
  find <- which(cond[ind])
  if (length(find)) {
    found_it[1] <- 1L
    return(ind[find[1]])
  } else {
    return(NULL)
  }
}
parallel::stopCluster(cl)

# Verification
all.equal(res, which(cond))
Basically, when a solution is found, the remaining blocks no longer need to be processed, and the other workers know it because you put a 1 in found_it, which is shared between all processes.
As your question is not reproducible and I don't understand everything you need, you may have to adapt this solution a little.

How to track progress of mclapply in R's parallel package

My question is related to this question. However, the question referenced above uses the multicore package, which was replaced by parallel, and most of the functions in the responses cannot be replicated with the parallel package. Is there a way to track progress in mclapply? Looking at the mclapply documentation, there is a parameter called mc.silent; I'm not sure whether it can be used to track progress, and if so, how and where we can see the log file. I'm running on Ubuntu Linux. Please see below for a reproducible example for which I would like to track progress.
require(parallel)

wait.then.square <- function(xx) {
  # Wait for two seconds
  Sys.sleep(2)
  # Square the argument
  xx^2
}

output <- mclapply(1:10, wait.then.square, mc.cores = 4, mc.silent = FALSE)
Any help would be greatly appreciated.
Thanks to the package pbmcapply you can now easily track progress of mclapply and mcmapply jobs. Just replace mclapply with pbmclapply:
wait.then.square <- function(xx) {
  Sys.sleep(2)
  xx^2
}

library(pbmcapply)
output <- pbmclapply(1:10, wait.then.square, mc.cores = 4)
...which will display a pretty progress bar.
The author has a nice blog post on the technical details and performance benchmarks here.
This is an update of my related answer.
library(parallel)

finalResult <- local({
  f <- fifo(tempfile(), open = "w+b", blocking = TRUE)
  if (inherits(parallel:::mcfork(), "masterProcess")) {
    # Child process: read progress updates from the fifo and print them
    progress <- 0.0
    while (progress < 1 && !isIncomplete(f)) {
      msg <- readBin(f, "double")
      progress <- progress + as.numeric(msg)
      cat(sprintf("Progress: %.2f%%\n", progress * 100))
    }
    parallel:::mcexit()
  }
  numJobs <- 100
  result <- mclapply(1:numJobs, function(...) {
    # Do something fancy here... For this example, just sleep
    Sys.sleep(0.05)
    # Send progress update
    writeBin(1 / numJobs, f)
    # Some arbitrary result
    sample(1000, 1)
  })
  close(f)
  result
})
cat("Done\n")

Run a function for a specified time in R

I'm trying to get a function to run for a specified amount of time; at the moment I'm using the system.time function. I can't figure out how to define a new variable that accumulates the running time of the function and then use it in a while loop.
timer <- system.time(
  simulated_results <- replicate(n = 1, simulation(J, 10000, FALSE, 0.1), simplify = "vector")
)[3]
print(timer)
while (cumsum(timer) < 15) {
  print(cumsum(timer))
  simulated_results <- replicate(n = 10000, simulation(J, 10000, FALSE, 0.1), simplify = "vector")
}
I would greatly appreciate any help!!!
If you want to run some code for a specified number of seconds, you can try the following:
start <- as.numeric(Sys.time())
duration <- 5
results <- NULL
while (as.numeric(Sys.time()) - start < duration) {
  results <- c(results, replicate(...))
}
Of course, you have to change the value of duration (in seconds), and replace replicate(...) with your code.
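If this pattern comes up often, it can be wrapped in a small helper. This is just a sketch; run_for() is a made-up name, not an existing function:

run_for <- function(fun, duration) {
  start <- as.numeric(Sys.time())
  results <- list()
  while (as.numeric(Sys.time()) - start < duration) {
    results[[length(results) + 1]] <- fun()
  }
  results
}

# e.g. collect single simulation draws for 5 seconds
out <- run_for(function() rnorm(1), duration = 5)
length(out)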
You can use a tryCatch approach for this task. For example, consider the following code:
library(R.utils)  # provides withTimeout()

fun_test <- function(test_parameter) {
  result <- 1 + test_parameter  # some execution
  return(result)
}

time <- 10  # seconds
tsp  <- 1   # example input (`tsp` was undefined in the original post)
res  <- NULL
tryCatch({
  res <- withTimeout({
    check <- fun_test(tsp)
  }, timeout = time)
}, TimeoutException = function(ex) {
  message("Timeout. Skipping.")
})
This program will run the function fun_test with a 10-second limit. If the execution finishes within that time, the result is returned; otherwise the execution is stopped. For more guidance, you can follow this URL:
Time out an R command via something like try()
