Is there a way (perhaps a website) where one can run R online in a Linux environment?
Motivation: I develop a few packages in R, and I often need to run tests on Linux. However, I use Windows and don't want to go through the hassle of learning Linux well enough to install it locally.
A few suggestions:
Install Docker to run a 'virtual' Linux on your Windows computer. That gives you essentially unlimited use on your own machine, letting you learn and test freely.
You can also go to rstudio.cloud to run a few hours of R within RStudio Cloud per month for free; if you need more hours, you can purchase them. This is possibly the easiest immediate approach, but it comes with a usage cap.
Similarly, Google Colab can run R in its notebooks, but the feature is still somewhat hidden. One source of tips is this SO answer.
If you want to (or can) test in batch mode, then R-hub is good. There is also a CRAN package, rhub, to interact with it; you need to create a token first, which is documented (see the sketch after this list).
Last but not least, CI providers let you run on their systems. GitHub Actions is popular and supports many operating systems and variants; GitLab had something similar much earlier too. My r-ci setup aims to facilitate this without tying you to a CI provider "forever". If you just want GitHub Actions, follow one of the many tutorials for it.
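For the R-hub route, a minimal sketch using the rhub client (version 1 API) follows; the email address is a placeholder, and platform names may change over time, so check platforms() for the current list.

# install.packages("rhub")
library(rhub)

# One-time setup: have a token emailed to you and store it locally
validate_email("you@example.com")

# List the available builders, then submit the package in the current
# directory for a check on a Debian Linux image
platforms()
check(".", platform = "debian-gcc-release")

Each submission runs the check on R-hub's Linux builders and returns the results, which covers the "test on Linux without installing Linux" use case in batch mode.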
Both RStudio Cloud and rdrr.io/snippets run on Linux (according to Sys.info()).
I am interested in running a multi-node parallel job with AWS ParallelCluster from R. Is there any useful documentation, guide, or R package that helps with this? As far as I understand, library(paws) does not support that service. Thank you.
It looks like the PAWS library is a client for individual services in AWS. However, AWS ParallelCluster is actually a downloadable Python package that helps you orchestrate multiple AWS services together into a Slurm-powered, dynamically scaling HPC cluster in the cloud.
Once you've configured your cloud HPC system using ParallelCluster, you can log into it using SSH or AWS Systems Manager and interact with it like any other Slurm cluster you might have experience with.
At a high level, your roadmap looks like this:
Install ParallelCluster
Design and configure your HPC cluster
Log into your cluster and install R in the shared $HOME directory
Run your multi-node parallel R job using a Slurm batch script (a sketch follows below)
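As a rough illustration of that last step, here is a minimal sketch of an R script that could be launched from a Slurm batch job on such a cluster. The submission line, task counts, and file names are hypothetical; it assumes R and passwordless SSH are available on all nodes and uses only the base 'parallel' package.

# run_simulation.R -- hypothetical example, submitted e.g. with
#   sbatch --nodes=4 --ntasks-per-node=8 --wrap="Rscript run_simulation.R"
library(parallel)

# Expand the Slurm node list into one hostname per allocated node
hosts <- system("scontrol show hostnames $SLURM_JOB_NODELIST", intern = TRUE)
tasks_per_node <- as.integer(Sys.getenv("SLURM_NTASKS_PER_NODE", "1"))
workers <- rep(hosts, each = tasks_per_node)

# One socket-based R worker per task, spread across the allocated nodes
cl <- makePSOCKcluster(workers)

# Placeholder workload: an embarrassingly parallel map over the cluster
results <- parLapply(cl, 1:1000, function(i) mean(rnorm(1e6, mean = i)))

stopCluster(cl)
saveRDS(results, "results.rds")

The key point is that the Slurm allocation tells the R session which hosts it may start workers on; higher-level packages can wrap this, but the base 'parallel' package is enough for a first test.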
I currently have an R script that does parallel processing within a loop using foreach, but it runs on a single server with 32 cores. Because of my data size, I am trying to find R packages that can distribute the computation across different Windows servers and still work with foreach for parallelization.
I really appreciate your help!
For several releases now, R has shipped with the base package parallel. You could do much worse than starting with its rather excellent (and still short) PDF vignette.
In a nutshell, you can just do something like
mclapply(1:nCores, someFunction)
and the function someFunction() will be run in parallel across nCores worker processes. Half of your physical cores may be a good value to start with.
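A slightly fuller, self-contained version of that call might look like the following (the workload is a made-up placeholder; note that mclapply() relies on forking, so it only parallelises on Unix-alikes, not on Windows):

library(parallel)

# Use roughly half of the detected cores, as suggested above
nCores <- max(1L, detectCores() %/% 2L)

# Placeholder task: each worker computes one simulated mean
someFunction <- function(i) mean(rnorm(1e6, mean = i))

res <- mclapply(1:nCores, someFunction, mc.cores = nCores)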
The Task View on High-Performance Computing has many more pointers.
SparkR is the answer. From "Announcing SparkR: R on Apache Spark":
SparkR, an R package initially developed at the AMPLab, provides an R frontend to Apache Spark and using Spark’s distributed computation engine allows us to run large scale data analysis from the R shell.
Also see SparkR (R on Spark).
To get started you need to set up a Spark cluster. This web page should help. The Spark documentation for a standalone deployment (i.e. without Mesos or YARN as your cluster manager) is here. Once you have Spark set up, see Wendy Yu's tutorial on SparkR. She also shows how to integrate H2O with Spark, which is referred to as 'Sparkling Water'.
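Once a cluster is running, connecting from R is short. The sketch below assumes SparkR (Spark 2.x API) is installed and that a standalone master is reachable at spark://master:7077; the master URL and app name are placeholders.

library(SparkR)

# Connect to the (hypothetical) standalone master
sparkR.session(master = "spark://master:7077", appName = "sparkr-example")

# Distribute a local data.frame across the cluster and aggregate it there
df <- createDataFrame(faithful)
head(summarize(groupBy(df, df$waiting), count = n(df$waiting)))

sparkR.session.stop()

From there the usual SparkR verbs (select, filter, groupBy, and so on) execute on the cluster rather than in the local R session.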
Having done a number of projects in Python and Node.js, I miss in Dart the interactive interpreter/console that those languages, as well as others like Ruby, provide so nicely.
Tests, logging, debuggers, and profilers are the instruments we use in application development when digging into issues or trying things out. But in scripting, and in server-side scripting in particular, the interpreter is the primary tool for trying things out. Having experience with both software development in a long list of languages and a number of scripting domains (Python and Bash for admin functions, Node for HTTP request evaluation, R for data analysis, etc.), I fail to see how, if server-side scripting is to be taken seriously, any language that does not provide an interpreter/console can hope for a sensible share of the pie.
Is Dart not intended for scripting, or am I just missing something obvious?
PS. There is (was) one project addressing the issue, but it did not see any development for the past 3 years: https://github.com/sam-mccall/dart-console
As far as I know, a REPL for the Dart language was not originally planned by the development team. The discussion about a REPL took place back in 2012 with no real outcome:
Github: Dart needs a REPL
So the answer is: there is no interactive interpreter/console for Dart, and it does not look like there are any plans to create one.
Observatory, Dartium, and the WebStorm debugger allow you to execute Dart code interactively. See also:
REPL for dartlang
Is there anyway to invoke a Dart REPL on a website, when using Dartium?
Is there an "Immediate Window" in the Dart Editor / Debugger?
I am planning to run Monte Carlo simulations in an R environment (Windows 7). However, I need to use old legacy statistical packages that are no longer executable on Windows 7, although I am aware that there are emulation solutions (like VMware) available. In addition, I need to integrate these packages into a seamless workflow so that simulated data from R functions is pushed to the old package, processed, and pulled back for further analysis in R.
I am aware that there are open-source workflow tools (such as KNIME) that can integrate different software packages, but my internet searches tend to be swamped with references to workflow-management business software that is irrelevant to me.
Is KNIME Analytics a suitable solution given my legacy software problem, and if not, what workflow tool would you suggest?
Kind regards,
Giulio Flore
I learnt from the web that Revolution R allows multi-threading and optimizes the running of my R scripts.
My question is: after installing Revolution R, if I run my R script in the Revolution R environment, will it automatically optimize the run? Or do I need to modify my R script to allow Revolution R to optimize it?
Thanks a lot.
I think your terminology may need some refinement: you need to distinguish multi-processing from multi-threading. Revolution R does link to a multithreaded BLAS library on Windows that would otherwise not be available unless you compiled it yourself. Whether or not that improves your performance apparently depends somewhat on which functions you use.
To use multi-processing in R, you will need to set up your machine resources appropriately and then use code that distributes the parallelizable tasks. Those seem to be the applications you are thinking about when you ask about modifying your scripts. Revolution R used to have advantages here over regular R, but for the last couple of releases the 'parallel' package has been available to all useRs.
Revolution R ships a multithreaded BLAS; this does not require any change in your scripts.
And GNU R, or Standard R, can of course also use multithreaded BLAS as detailed in Appendix A.3.1 of the R Installation and Administration manual.
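If you want to verify which BLAS your R session actually uses, a quick check (assuming R >= 3.4, where sessionInfo() reports the BLAS and LAPACK library paths) is:

# Paths of the BLAS/LAPACK shared libraries the running R is linked against
si <- sessionInfo()
si$BLAS
si$LAPACK

# A rough benchmark that benefits from a multithreaded BLAS
m <- matrix(rnorm(2000 * 2000), nrow = 2000)
system.time(m %*% m)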