How to interact with R from VBA? [closed]

Suppose I have a VBA macro in Excel which does some calculation, and I would like to do part of that calculation in R, programmatically. Say, at some moment the Excel macro has a vector and needs to find its mean using R's mean function. How can I call R from VBA, pass a vector to R, run the calculation in R, and get the result back into VBA? Thanks.

There is a plugin called RExcel, but I found it quite painful to use (and you more or less have to pay for it).
The easiest and most general, if hacky, way to set up the interaction is the following:
1) Save your array/matrix/vector as csv in a folder.
2) Write your R code in a file that reads the csv and writes the result to another csv (see the sketch below).
3) Call the R script from VBA with the VBA Shell function (Rscript scriptName.R).
4) Import the result back into Excel/VBA.
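A minimal sketch of the script in step 2, assuming the vector was exported as a single csv column (mean_from_csv.R, input.csv and output.csv are placeholder names you would pick yourself):
# mean_from_csv.R -- read a vector exported from VBA, compute its mean, write it back
input  <- read.csv("input.csv", header = FALSE)  # one column of numbers from Excel
result <- mean(input[[1]])
write.csv(data.frame(mean = result), "output.csv", row.names = FALSE)
From VBA, step 3 then just shells out to Rscript mean_from_csv.R, and step 4 reads output.csv back in.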
This method has the advantage that it separates the computational logic from the formatting done in VBA.
You could also run R code directly from VBA with Rscript's -e option, but this is strongly discouraged.
Hope it helps!
BTW: the same approach works with any other program (Python/LaTeX/Matlab).

Related

Extract table data from PDF that is formatted as picture [closed]

I am trying to extract the data in the tables that start on p.52 of this document (a report from FAA).
The problem is that the tables are included as pictures. Any chance I can get some pointers on how to do that without doing it manually?
I have tried converting it to text using Adobe's OCR function, and I have also tried using the extract_tables function in R's tabulizer package.
I could of course do this manually, but it would be good to know if there is a more efficient way of doing it.
It's possible, but the accuracy depends on the image. I always use grayscale images. Here is an example of the available tools. In your case, I'd suggest you take some screenshots of the tables and use OCRFeeder to compare the results from GOCR and Tesseract.
sudo apt-get install gocr tesseract-ocr ocrfeeder
ocrfeeder -i image.jpg
After some manual checks, you can import the output into LibreOffice Calc, save it as csv, and read it into R.
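If you prefer to stay inside R, the tesseract package wraps the same OCR engine (a minimal sketch; the image file name is a placeholder):
# install.packages("tesseract")
library(tesseract)
text <- ocr("table_screenshot.png")  # hypothetical screenshot of one table
cat(text)
You would still want to proofread the output before reshaping it into a data frame.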

Why .RData when .R is sufficient [closed]

Say, if we can save and load exactly the same data from .R files, then why was .RData needed? I tried to figure out an explanation from [R] foo.RData or foo.r?, and I stumbled on a few questions:
Does .RData save only the final result, or the complete code like a .R script?
What is their exact relevance? Which one should be preferred over the other, and when?
.RData saves objects, not scripts: if you load it, you load the objects into your environment. It does not contain the code used to produce those objects.
A .R file is a script with no objects in it: if you open it, you'll see code, and you'll need to source it to get the objects the script produces.
I would advise using them this way:
.R: store functions, and the scripts used to create an object (for the sake of reproducibility, for example in /data-raw in packages)
.RData: store objects you'll need later
This is basically how a package works: an /R folder with functions, and a /data folder containing the data objects the package needs.
In a .R file you save R code; in a .RData file you save R data structures, e.g. a vector, matrix, data frame, or fitted linear model.
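A minimal illustration of the difference (the file names here are placeholders):
# A .R file holds code; source() re-runs it to recreate the objects
writeLines('m <- lm(mpg ~ wt, data = mtcars)', "fit_model.R")
source("fit_model.R")    # creates m by running the code

# A .RData file holds the objects themselves, with no code attached
save(m, file = "model.RData")
rm(m)                    # m is gone from the environment
load("model.RData")      # m is back, without refitting anything
summary(m)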

Glicko-2 implementation in R, where to find? [closed]

I am looking for an R implementation of Mark Glickman's excellent Glicko-2 algorithm. Thus far I have found this one. Although this is a very nice piece of code, I am looking in particular for code that can deal with large data frames of match scores (meaning that it can rank all the players in the data frame in one go), a bit like the way the PlayerRatings package does it for e.g. Elo and Glicko. Unfortunately that package doesn't have an implementation of the Glicko-2 algorithm.
Does anyone have an idea?
Glicko-2 and a few other algorithms are available in the R package sport, for both two-player and multi-player matchups. It is available on CRAN and GitHub, with a vignette, standardized syntax, and a C++ backend.
Quick snippet
# install.packages("sport")
library(sport)
glicko2 <- glicko2_run(formula = rank|id ~ rider, data = gpheats)
# computation results
print(glicko2)
summary(glicko2)
tail(glicko2$r)
tail(glicko2$pairs)
If you had noticed the fine print at the bottom of Mark Glickman's page, you would have seen (in tiny text, admittedly)
PlayerRatings, an R package implementation of Glicko, as well as a
few other rating systems
with the link being: https://cran.r-project.org/web/packages/PlayerRatings/
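Note that PlayerRatings implements Glicko rather than Glicko-2, but it does rate a whole data frame of matches in one go. A minimal sketch, assuming the four-column layout its documentation describes (period, player one, player two, result):
# install.packages("PlayerRatings")
library(PlayerRatings)
matches <- data.frame(week  = c(1, 1, 2),
                      p1    = c("A", "B", "A"),
                      p2    = c("B", "C", "C"),
                      score = c(1, 0, 0.5))  # 1 = p1 wins, 0 = p1 loses, 0.5 = draw
glicko(matches)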

SAS program equivalent in R [closed]

I'm new to R and I'm wondering if R has something similar to a SAS program, where it can store the code. I need to run the analysis on my data set (updated monthly) every month. In SAS, I can just run the program and it gives me the results. I'd like to know if R has something similar to that. Thank you so much!
[Update] Thanks for the answers. R script is what I'm looking for!
Are you just talking about running an R script? If you have a text file called codefile.R containing R code, then from within an interactive R session source("codefile.R") will run it. Or you can use R CMD BATCH codefile.R from a command line/shell/terminal.
update: Dirk Eddelbuettel points out that Rscript or the littler package are recommended over R CMD BATCH ...
Yes. Just like you can create SAS programs in the enhanced editor, you can create R scripts in R. This process is even easier, in my opinion, when you write your R scripts using the tools that come with RStudio.
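For example, a monthly job can live in one script (the file and column names here are made-up placeholders):
# monthly_report.R -- hypothetical monthly analysis
dat   <- read.csv("monthly_data.csv")                      # this month's extract
stats <- aggregate(value ~ group, data = dat, FUN = mean)  # per-group means
write.csv(stats, "monthly_summary.csv", row.names = FALSE)
Each month you drop in the new data file and re-run it, with source("monthly_report.R") from inside R or Rscript monthly_report.R from the shell.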

R: Lapply equivalent for Revoscaler/Revolution Enterprise? [closed]

I have Revolution Enterprise and want to run 2 simple but computationally intensive operations on each of 121k files in a directory, outputting to new files. I was hoping to use some RevoScaleR function that chunked/parallel-processed the data similarly to lapply. So I'd have lapply(list of files, function), but using a faster RevoScaleR function that might actually finish, since I suspect basic lapply would never complete.
So is there a RevoScaleR version of lapply? Will running it from Revolution Enterprise automatically chunk things?
I see parLapply and mclapply (http://www.inside-r.org/r-doc/parallel/clusterApply)... can I run these using cores on the same desktop? AWS servers? Do I get anything out of running these packages in Revolution if it's not a native RevoScaleR function? I guess this is really a question about what I can use as a "cluster" in this situation.
There is rxExec, which behaves like lapply in the single-core scenario, and like parLapply in the multi-core/multi-process scenario. You would use it like this:
# vector of file names to operate on
files <- list.files()

# func carries out the operations you want on each file
func <- function(fname) {
...
}

rxSetComputeContext("localpar")
rxExec(func, fname = rxElemArg(files))
Here, func is the function that carries out the operations you want on the files; you pass it to rxExec much as you would to lapply. The rxElemArg function tells rxExec to execute func once for each of the values of files. Setting the compute context to "localpar" starts a local cluster of slave processes, so the operations run in parallel. By default the number of slaves is 4, but you can change it with rxOptions(numCoresToUse).
How much speedup can you expect to get? That depends on your data. If your files are small and most of the time is taken up by computations, then doing things in parallel can get you a big speedup. However, if your files are large, then you may run into I/O bottlenecks especially if all the files are on the same hard disk.
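For comparison, since the question also mentions parLapply: base R's parallel package can do the same per-file fan-out on local desktop cores without RevoScaleR (a sketch; process_file is a hypothetical stand-in for your two operations):
library(parallel)

# hypothetical per-file operation: read, transform, write
process_file <- function(fname) {
  dat <- read.csv(fname)
  write.csv(dat, file.path("out", basename(fname)), row.names = FALSE)
}

dir.create("out", showWarnings = FALSE)
files <- list.files(pattern = "\\.csv$")
cl <- makeCluster(detectCores() - 1)  # worker processes on the same desktop
parLapply(cl, files, process_file)
stopCluster(cl)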
