How to generate and save random variable names in Classic asp - asp-classic

So I am working on a project where I get a list of IDs and have to execute 10 different queries for each ID. The catch is that our existing code handles only a single ID. It looks like:
DBConnect accDB1,sql1,A
DBConnect accDB1,sql2,B
DBConnect accDB1,sql3,C
DBConnect accDB1,sql4,D
DBConnect accDB1,sql5,E
Now, for N IDs, I need something like:
for(i=0;i<N;i++)
{
DBConnect accDB1,sql1,A //A0 for i=0, A1 for i=1....
DBConnect accDB1,sql2,B
DBConnect accDB1,sql3,C
DBConnect accDB1,sql4,D
DBConnect accDB1,sql5,E
}
Since we first execute all the queries for all IDs and only then display everything, I can't display inside the loop.
What would be a good approach to this problem?

Set your A, B, C, etc. variables to be arrays of size N; then you can store each result at its appropriate index and loop through them later for display:
' Classic ASP / VBScript: dynamic arrays sized to the number of IDs
Dim A(), B(), C(), D(), E()
ReDim A(N - 1) : ReDim B(N - 1) : ReDim C(N - 1) : ReDim D(N - 1) : ReDim E(N - 1)
For i = 0 To N - 1
    DBConnect accDB1, sql1, A(i)
    DBConnect accDB1, sql2, B(i)
    DBConnect accDB1, sql3, C(i)
    DBConnect accDB1, sql4, D(i)
    DBConnect accDB1, sql5, E(i)
Next

Related

How to find / modify objects directly on parallel workers in R

I have an expensive problem I'm trying to split into pieces.
It's an optimization problem with an initial expensive setup step followed by a recursive structure: the workers can only perform one step at a time before the results need to be collected and a new task sent out.
A complicating feature is that an initial setup step for the sub-computations has to be performed directly on each worker and cannot be exported to the workers via clusterExport or similar.
I had hoped to use clusterApply to assign the outcome of this initial setup so it is stored on the specific worker, but can't seem to achieve this.
The first part of my code below shows my current attempts and describes what I would like; the second shows an attempt to see all objects available on the worker and where they are located.
library(parallel)

### What I would like to do:
test2 <- function() {
  MYOBJECT <- 0
  cl <- makeCluster(2, type = "PSOCK")
  clusterExport(cl, c("MYOBJECT"), envir = environment())
  clusterApply(cl, 1:2, function(x) { # attempts to modify/create MYOBJECT on the worker processes
    y <- x * 2 # expensive operation I only want to do once, that *cannot* be exported to the worker
    MYOBJECT <<- y
    MYOBJECT <- y
    assign("MYOBJECT", y, envir = parent.frame())
  })
  clusterApply(cl, 1:2, function(x) MYOBJECT * .5) # cheap operation to be done many times
}
test2() # should return a list of 1 and 2, without assignment into the test2 function environment / re-exporting
# trying to find out where MYOBJECT is on the worker
test <- function() {
  MYOBJECT <- 1
  cl <- makeCluster(1, type = "PSOCK")
  clusterExport(cl, c("MYOBJECT"), envir = environment())
  clusterApply(cl, 1, function(x) {
    MYOBJECT <<- list("hello")
    assign("MYOBJECT", list("hellohello"), envir = parent.frame())
  })
  clusterApply(cl, 1, function(x)
    lapply(sys.frames(), ls) # where is MYOBJECT?
  )
}
test()
Simple solution in the end: to modify the contents of individual workers in a persistent manner, the assignment inside the clusterApply function needs to target the worker's global environment.
library(parallel)

### What I would like to do:
test2 <- function() {
  MYOBJECT <- 0
  cl <- makeCluster(2, type = "PSOCK")
  clusterExport(cl, c("MYOBJECT"), envir = environment())
  clusterApply(cl, 1:2, function(x) {
    y <- x * 2 # expensive operation I only want to do once, that *cannot* be exported to the worker
    assign("MYOBJECT2", y, envir = globalenv()) # persists in the worker's global environment
  })
  clusterApply(cl, 1:2, function(x) MYOBJECT2 * .5) # cheap operation to be done many times
}
test2() # returns a list of 1 and 2, without assignment into the test2 function environment / re-exporting
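An alternative that avoids assign() entirely (a sketch, with a cheap stand-in for the expensive setup): clusterEvalQ() evaluates an expression directly in each worker's global environment, so bindings created there persist across subsequent calls on the same cluster:

```r
library(parallel)

cl <- makeCluster(2, type = "PSOCK")

# clusterEvalQ evaluates its expression in each worker's global
# environment, so MYOBJECT persists on the workers between calls.
invisible(clusterEvalQ(cl, {
  MYOBJECT <- 2 # stand-in for the expensive per-worker setup
}))

# Later, cheap calls can use the object without re-exporting it.
res <- clusterApply(cl, 1:2, function(x) MYOBJECT * x)

stopCluster(cl)
print(res)
```

The limitation is that clusterEvalQ takes a literal expression, so it fits setup that can be written out in place; for setup computed from local data, the assign-to-globalenv() pattern above is the way to go.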

Foreach does not perform IF statements

I've implemented foreach in one of the multiple for statements of my R code. It returns the main result (the one after all the iterations); however, it does not appear to execute an if statement within the code.
Below is the skeleton of my code (it's too long to post everything). The if statement does not seem to run, and the variable Disc_Time remains as initialized. What am I doing wrong or missing? I've tried with .export="f" and .export=ls(.GlobalEnv) without success.
library(foreach)
library(doParallel)

cores <- detectCores()
cl <- makeCluster(cores[1] - 1) # leave one core free so as not to overload the computer
registerDoParallel(cl)

Disc_Time <- c("UE", "Beam_order", "Time")
# .... MORE VARIABLES

MDP_x <- foreach(d = 1:length(dista), .combine = 'c') %dopar% {
  for (q in 1:sim) {
    for (ue in 1:n) {
      for (i in 1:length(seq_order_BS)) {
        for (j in 1:length(seq_order_UE)) {
          if (first == 0) {
            Disc_Time <- rbind(Disc_Time, c(ue, i, D_Time))
          }
        }
      }
    }
  }
}
stopCluster(cl)
To see whether your if statement is working, we need to know how first is set and what value it has before your loop. first does not appear to change within the loop, so that check really belongs outside the %dopar% statement.
That said, the if statement is not your issue. foreach returns a list containing the value of the expression from each iteration. For example:
res <- foreach(d = 1:5) %dopar% {
  d
}
gives a list res that contains the numbers one to five.
The only expression in your foreach body is an assignment to Disc_Time. It is evaluated within each of your nodes and never returned to the parent environment; Disc_Time is never changed where the code was called from.
It looks as though you are relying on a side effect of your parallel code (changing Disc_Time), which to my knowledge is not possible in a parallel context. Perhaps you want:
MDP_x <- foreach(d = 1:length(dista), .combine = 'c') %dopar% {
  for (q in 1:sim) {
    for (ue in 1:n) {
      for (i in 1:length(seq_order_BS)) {
        for (j in 1:length(seq_order_UE)) {
          if (first == 0) {
            return(rbind(Disc_Time, c(ue, i, D_Time)))
          } else {
            return(NA)
          }
        }
      }
    }
  }
}
stopCluster(cl)
MDP_x should then have the values you want for each d.

Run different function in different threads/task in R

Does R have any mechanism for running different calculations in different threads (a Windows-like mechanism of threads/tasks)? Say we have:
func1 <- function(x) { return (x^2); }
func2 <- function(y) { return (y^3); }
I need to execute something like this (pseudo-code):
thread1 <- thread_run(func1);
thread2 <- thread_run(func2);
with same mechanism of synchronization, like:
wait(thread1);
wait(thread2);
You can do that with the future package:
install.packages("future")
library(future)
Then keep your code as it is and just change the assignment to:
thread1 %<-% thread_run(func1);
thread2 %<-% thread_run(func2);
More to read here: http://www.r-bloggers.com/a-future-for-r-slides-from-user-2016/
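A minimal sketch of that pattern with concrete functions (plan(multisession) and the toy inputs are assumptions, not part of the question): %<-% starts each computation asynchronously, and simply reading the variable blocks until its future resolves, which plays the role of wait():

```r
library(future)
plan(multisession) # evaluate futures in separate background R sessions

func1 <- function(x) x^2
func2 <- function(y) y^3

thread1 %<-% func1(4) # starts asynchronously
thread2 %<-% func2(3) # starts asynchronously

# Reading the promises blocks until each future has resolved.
print(thread1 + thread2) # 16 + 27 = 43
```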

run a for loop in parallel in R

I have a for loop that is something like this:
for (i in 1:150000) {
  tempMatrix <- functionThatDoesSomething() # calling a function
  finalMatrix <- cbind(finalMatrix, tempMatrix)
}
Could you tell me how to make this parallel?
I tried the following based on an example online, but am not sure if the syntax is correct. It also didn't increase the speed much.
finalMatrix = foreach(i=1:150000, .combine=cbind) %dopar% {
tempMatrix = {}
tempMatrix = functionThatDoesSomething() #calling a function
cbind(finalMatrix, tempMatrix)
}
Thanks for your feedback; I did look up parallel processing after I posted this question.
After a few tries I got it running. I have added the code below in case it is useful to others:
library(foreach)
library(doParallel)

# set up parallel backend to use many processors
cores <- detectCores()
cl <- makeCluster(cores[1] - 1) # leave one core free so as not to overload the computer
registerDoParallel(cl)

finalMatrix <- foreach(i = 1:150000, .combine = cbind) %dopar% {
  tempMatrix <- functionThatDoesSomething() # calling a function
  # do other things if you want
  tempMatrix # equivalent to finalMatrix = cbind(finalMatrix, tempMatrix)
}

# stop cluster
stopCluster(cl)
Note: if you allocate too many processes, you may get this error: Error in serialize(data, node$con) : error writing to connection.
Note: if .combine in the foreach statement is rbind, then the final object is created by appending the output of each iteration row-wise.
Hope this is useful for folks trying out parallel processing in R for the first time like me.
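For comparison, the same collect-then-bind pattern can be written with only the base parallel package (functionThatDoesSomething here is a toy stand-in; substitute your own):

```r
library(parallel)

# Toy stand-in for the real work: returns a 2x1 matrix per task.
functionThatDoesSomething <- function(i) matrix(c(i, i^2), nrow = 2)

cl <- makeCluster(2)
clusterExport(cl, "functionThatDoesSomething")

# Collect all per-iteration results first, then bind them once at the
# end; growing finalMatrix with cbind inside a loop is much slower.
results <- parLapply(cl, 1:10, functionThatDoesSomething)
finalMatrix <- do.call(cbind, results)

stopCluster(cl)
print(dim(finalMatrix)) # 2 10
```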
References:
http://www.r-bloggers.com/parallel-r-loops-for-windows-and-linux/
https://beckmw.wordpress.com/2014/01/21/a-brief-foray-into-parallel-processing-with-r/

Calling user specified R function in inline C++ body

I have been working with the R package RcppArmadillo. I have already used it to define two cxxfunctions (they have been debugged and are fine to use):
calc1 <- cxxfunction(signature(A="integer", B="integer"),...)
calc2 <- cxxfunction(signature(A="integer", K="integer"),...)
Now I'm writing the body of another cxxfunction, main, and wish to call calc1 and calc2 within its for loops, like:
body_main = '
...
for(int i=0; i<N; i++){
// This is where I want to call calc1.
// (?)
for(int j=0; j<N; j++){
// This is where I want to call calc2.
// (?)
}
}
'
Is there any way I can achieve that? Can it be done in an inline fashion?
I haven't seen an example of inline usage of RcppArmadillo (or Rcpp, RcppGSL) in which people write a subroutine within the body part - specifically, code that looks like this:
body_example = '
// Subroutine
SEXP(/*or something else*/) func_0(SEXP A, SEXP B){
...
return ...;
}
// Then call it from the main part
...
AB = func_0(A, B);
...
'
My question probably looks naive, but it haunts me nevertheless. Can anyone help explain this? I'd appreciate it a lot!
You could switch from using cxxfunction() from the inline package to using Rcpp attributes and its sourceCpp(). That way you get predictable function headers at the C++ level; see the Rcpp attributes vignette.
Or split calc1 and calc2 into 'worker' and 'wrapper', have cxxfunction() around the wrapper allowing you to call the worker.
The key issue here really is that cxxfunction() exists to create an R-callable function, and it generates internal randomized function headers.
Lastly, a package would help too.
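A sketch of the attributes route (the function bodies are placeholders, not the asker's actual calc1/calc2): with sourceCpp(), every exported function is an ordinary C++ function with a predictable name, so one can call another directly:

```r
library(Rcpp)

# Two exported C++ functions; mainCalc calls calc1 at the C++ level,
# with no need for randomized internal headers or R round-trips.
sourceCpp(code = '
#include <Rcpp.h>

// [[Rcpp::export]]
int calc1(int a, int b) { return a + b; } // placeholder body

// [[Rcpp::export]]
int mainCalc(int n) {
  int total = 0;
  for (int i = 0; i < n; i++)
    total += calc1(i, i); // calc1 is a plain C++ function here
  return total;
}
')

print(mainCalc(5)) # 0 + 2 + 4 + 6 + 8 = 20
```

Both calc1 and mainCalc also become callable from R, which makes the individual pieces easy to test.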
