Web-based interpreter for the language R [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
We don’t allow questions seeking recommendations for books, tools, software libraries, and more. You can edit the question so it can be answered with facts and citations.
Closed 7 years ago.
I am looking for a web-based interpreter for the language R.
To be more precise, I am looking for an IDE like http://codepad.org/ where I can provide the code and the server executes it and returns the output.
I went through applications like Rapache, but they don't fit my requirement, as they are not made to accept code from a client, execute it, and return the result.
In short, I could find web applications that take input from the user, execute a specific R script, and present the output in a neatly formatted way, but not a web application that accepts arbitrary R code, executes it, and presents the result neatly.

A few possibilities come to mind:
ideone supports a lot of different languages, R among them. When you run a script, you are given a link that you can embed in a webpage (though the embed doesn't show the output, unfortunately). If you create an account, you can also store your previously run scripts.
Pro: You can insert /plain/ into the URL of your script to get a version that can be sourced directly in R. For example, if the URL for your script online is "http://ideone.com/PIkeD", then you can use source("http://ideone.com/plain/PIkeD") to load your script directly from the ideone servers.
Cons: Might not always run the most current version of R (presently 3.2.2). Can't install other packages. Output doesn't show in the embed script provided.
Cloudstat console runs a more recent version of R (2.15.1) with quite a few commonly used packages. It used to have a really interesting blog/notebook interface that integrated code and output, but that doesn't seem to be available at the moment.
Pro: Useful for running something fairly straightforward in a pinch.
Cons: Can't install other packages. Output is not formatted in code blocks, so is not easily readable. At the moment, can't save or share the code you've run.
Crunch offers a full RStudio setup, runs the most recent version of R, and allows you to install the packages you need. This may be more convenient than having to install your own RStudio server. You do have to request an account though.
Pros: Pretty much all you would expect from R/RStudio. Allows you to use Sweave and R markdown to automatically create documents too. These documents can be publicly hosted too. Here's an example where I've placed a page in a public folder called "gallery": http://crunch.kmi.open.ac.uk/people/~mrdwab/gallery/howzat.html
Cons: Sometimes the loading time is a bit slow, but as I am running RStudio desktop, I don't know how Crunch compares to running my own RStudio server.
Updated January 10, 2014
Recently, there has also been a decent amount of buzz around R-Fiddle as an interesting way to share R code. It looks like it is what powers the awesome http://www.rdocumentation.org/ site.

RStudio IDE (Server) may be the answer to your question. Have a look at http://www.rstudio.com/ide/

You can try RCloud, which we are developing at AT&T Research. It's an open-source IDE, like RStudio or IPython, with more advanced collaboration capabilities.
https://github.com/att/rcloud
RCloud is an environment for collaboratively creating and sharing data analysis scripts. RCloud lets you mix analysis code in R, HTML5, Markdown, Python, and others. Much like Sage, iPython notebooks and Mathematica, RCloud provides a notebook interface that lets you easily record a session and annotate it with text, equations, and supporting images.

Related

Correct way to create a software install script which can manage dependencies

I'm currently working on university-research software that uses statistical models to run calculations based on Item Response Theory. The entire source code is written in Go, and it communicates with an Rscript server to run scripts written in R and return the generated results. As expected, the software has some dependencies needed to work properly (one of them, as mentioned, is having R/Rscript installed along with some of its packages).
Since I'm new to software development, I can't find a proper way to manage all these dependencies on Windows or Linux (though I'm prioritizing Windows right now). What I was thinking of is a kind of script that checks whether, for example, R is properly installed and, if so, whether each used package is also installed. If everything checks out, the software could then be installed without further problems.
My question is: what's the best way to do something like that, and is it possible to do the same for other dependencies, such as Python, Go, and their libraries? I'm also open to suggestions if installing programming languages locally on the machine isn't the proper way to manage software dependencies, or if there's a more convenient approach than a script.
Sorry if any needed information is missing; let me know and I'll add it.
Thanks in advance
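For the R side of the check described above, one minimal sketch is to test each required package with requireNamespace() and fail with a clear message if anything is missing. The package names below are placeholders, not taken from the project (substitute the IRT packages actually used, e.g. ltm or mirt):

```r
# Sketch of a pre-install dependency check for the R side.
# Package names are placeholders; substitute the real dependencies.
required <- c("stats", "utils")

# requireNamespace() returns FALSE instead of erroring when a
# package is absent, so it is safe to probe with.
missing <- required[!vapply(required, requireNamespace,
                            logical(1), quietly = TRUE)]

if (length(missing) > 0) {
  stop("Missing R packages: ", paste(missing, collapse = ", "),
       ". Install them with install.packages().")
} else {
  cat("All required R packages are installed.\n")
}
```

The Go side can do the analogous check before this runs, e.g. verifying that the Rscript binary is on the PATH at all.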

Best Practise Coding for R script running in production [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Want to improve this question? Update the question so it focuses on one problem only by editing this post.
Closed 6 years ago.
We have a Linux production server and a number of scripts we want to run on it to collect data, which will then be loaded into a Spark data lake.
My background is SQL Server / Fortran and there are very specific best practices that should be followed.
Production environments should be stable in terms of version control, both from the code point of view, but also the installed applications, operating system, etc.
Changes to code/applications/operating system should be done either in a separate environment or in a way that is controlled and can be backed out.
If a second environment exist, then the possibility of parallel execution to test system changes can be performed.
Developers are (largely) restricted from changing the production environment.
In reviewing the R code, there are a number of things that I have questions on.
library(), install.packages(): how can I rule out installing newer versions of packages each time the scripts are run?
What is the best way to invoke R scripts that are scheduled through a cron job? There are a number of choices here.
When using RSelenium, what is the most efficient way to use a GUI/web browser or a virtualised web browser?
In any case, I would scratch any notion of updating the packages automatically. Expect the maintainers of the packages you rely on to introduce backward-incompatible changes: your code will stop working out of the blue if you auto-update. Do not assume anything is sacred.
Past that, you need to ask yourself how hands-on your deployment is. If you're OK with manually setting up each deployment, you can probably get away with using the packrat package to pull down and keep sources of the exact versions you are using. This way, reproducing your deployment is painful, but at least possible. If you want fully automated, reproducible deployments, I suggest you start building Docker images with your packages and tagging them with dates or versions.
If you make no provisions for reproducing your environment, you are asking for trouble. It may seem OK at first to simply fix incompatibilities as they come up with updates, and that does indeed seem to be the official workflow from the powers that be, however misguided that is; but as your codebase grows, that will eventually be all you end up doing.
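To make the "no silent upgrades" policy concrete at the script level, one minimal sketch is to assert at startup that the installed packages are at least the versions the script was developed against. The pins below are illustrative values, not real requirements; a packrat snapshot or tagged Docker image remains the fuller solution, since this only detects drift rather than preventing it:

```r
# Fail fast if any dependency is older than the version this
# script was developed against. Pins below are examples only.
pinned <- c(utils = "2.0.0", stats = "2.0.0")

for (pkg in names(pinned)) {
  # packageVersion() returns a comparable version object.
  if (packageVersion(pkg) < pinned[[pkg]]) {
    stop(sprintf("Package %s is older than pinned version %s",
                 pkg, pinned[[pkg]]))
  }
}
```

A cron-launched script that starts with such a guard will die loudly, and log via cron's mail or redirected stderr, instead of producing silently wrong output on a drifted library.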

Deploying an R app with a GUI [closed]

As it currently stands, this question is not a good fit for our Q&A format. We expect answers to be supported by facts, references, or expertise, but this question will likely solicit debate, arguments, polling, or extended discussion. If you feel that this question can be improved and possibly reopened, visit the help center for guidance.
Closed 9 years ago.
I developed an R application and I want to deploy it.
Currently the application consists of a set of functions to be run from the command line, like an R package. In order to deploy it, I am thinking of repackaging R Portable adding the necessary libraries and my code to it. My main problem is choosing a proper GUI toolkit.
Production Environment
My app is a single-user one (i.e. a Desktop application) and the target platform is Windows. It could bootstrap in R and then call the toolkit, or bootstrap, say, in Java and then call the R engine. The GUI should first and foremost feed the app functions. It should also grab the function graphical output.
Possible Alternatives
Here is a list of potential alternatives. I'd like to know whether they fit the production environment described above.
Java JRI is now released only as a part of rJava, but while the latter is clearly documented, I am unable to find docs and tutorials for the former.
As for Deducer, it is presented as a GUI front-end, but I found out that it is also a GUI toolkit
TCL/Tk bindings seem a natural choice for R and are well documented, but someone complains about limitations of this toolkit.
RGtk2 seems interesting and there are also some tutorials around.
gWidgets is one of the rare toolkits to sport a package vignette!
Although I don’t need a real web application, an interesting option would be interfacing R with JavaScript/HTML. Like most of us, I am familiar with this environment, and the app could benefit from the many JS libraries.
The problem is that the beautiful Shiny Server and rApache are Linux-only, and this is probably true of Concerto too. Rserve, instead, runs on Windows and, while there is no official JS client, I found the third-party rserve-js and also a node.js client.
Rook, by the same author as rApache, should be platform-agnostic (shouldn't it?).
R Server Pages could work, but I didn't find examples on the functions HttpDaemon and HttpRequest in the vignette or reference manual.
I run some simple examples with gWidgetsWWW. It works, but it seems to produce canned web pages, without the possibility of modifying the HTML code.
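Of the toolkits listed above, the Tcl/Tk bindings have the lowest barrier to entry, since the tcltk package ships with base R and needs no extra installation on Windows. A minimal sketch (the window title and button label are illustrative; the GUI part is guarded so the script stays runnable in headless, non-interactive sessions):

```r
# Minimal tcltk window: one button that triggers an R callback.
# Only build the GUI when Tk support is compiled in and the
# session is interactive (e.g. not under Rscript in a pipeline).
if (capabilities("tcltk") && interactive()) {
  library(tcltk)

  win <- tktoplevel()
  tkwm.title(win, "My R App")          # illustrative title

  run_btn <- tkbutton(win, text = "Run analysis",
                      command = function() cat("analysis started\n"))
  tkpack(run_btn, padx = 30, pady = 20)
}
```

Function output (plots etc.) would still need to be routed to a Tk canvas or an image file, which is where the heavier toolkits above may earn their complexity.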
EDIT
Let me clarify my question. I am not surveying your personal preferences.
The technologies or products mentioned here tend to be very young and not widespread. It would be very unpleasant to discover, after investing months of code, that they are not yet ready or don’t fit production. So I would like to know (not your subjective tastes, but) if they are able to work in the environment described above.
At rapporter.net we have created a kind of web app building on rApache and Ruby on Rails, along with some other technologies. It turned out to act more as a framework for hosting R-based statistical applications ("Rapplications") than as our initial goal of a user-friendly online front-end to R. I'd warmly suggest checking out its features, as you might save tons of resources by not dealing with server-side, CMS, and other boring issues, and could concentrate on the statistical tool instead.
Anyway, besides promoting our stuff, let me summarize my experiences:
rApache is definitely ready for production, but note: only for rather stateless algorithms (by default, Apache starts a bunch of workers, so the same user/client would end up interacting with a different R session on each query). Rserve, for example, would be a better alternative for a stateful application.
AFAIK Shiny Server is meant to host dedicated statistical tools and applications (much like our Rapplication service, with or without a DB backend) with some customizable user input. You would need some technical skills to set it up, and providing a high-availability environment might require way too many extra resources. This can be a huge advantage or disadvantage depending on your requirements and expectations.
The biggest issue in such a setup should be security (e.g. using RAppArmor or sandboxR), not just the R connector back-end, as users will interact with your servers (if hosted in the cloud). Desktop applications are a bit more developer-friendly, but supporting all major platforms can be a no-go in the era of tablets and smartphones; a cloud app can run on any device with a browser.
You should choose the optimal solution based on your requirements. There are plenty of tools ready for production, each with its own advantages and special use cases. Just check which related packages/applications are still under active development and support, and answer a few questions:
Is there any need to connect to databases?
What types of user input are needed (e.g. only parameters, datasets, R commands)?
Desktop or cloud app? Are you sure? If the latter, are you willing to deal with the setup, maintenance and support?
Do you run any computationally intensive tasks?
Do you want an application that helps users with repetitive, standardized tasks, or rather a general and extensible piece of software?
Do you need a responsive application with interactive setup, or is it for reporting purposes?
What output formats do you need?
What other technologies are you familiar with? It's rather hard to build a Meteor-based app with a NoSQL backend if you have only worked with MySQL, PHP, Java or C# in the past.
I'm about to do something similar. The fastest way (in terms of both deployment time and future application performance) seems to be C# interfacing with R through R.NET. In Visual Studio you will benefit from incredible visualisation options with just a few clicks, and setting up interactive/drill-down charts is quite straightforward too. As you also mentioned, Rserve (with Java) is another valuable option.
EDIT
If the web application is not required to run on a public IP address, the R package Rook is an interesting option. Some examples with Rook: using ggplot2, using googleVis.
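For reference, a Rook application is just an R function that takes the request environment and returns a status/headers/body list, so a minimal sketch looks like this (the app name and port are arbitrary choices, and the server part is guarded in case Rook is not installed):

```r
# A Rook app is a function of the request environment that returns
# a list with status, headers and body (Rack-style).
hello_app <- function(env) {
  list(
    status  = 200L,
    headers = list("Content-Type" = "text/html"),
    body    = "<h1>Hello from R</h1>"
  )
}

# Serve it on localhost if Rook is available; the app is then
# reachable under the /custom/<name> path of the built-in server.
if (requireNamespace("Rook", quietly = TRUE)) {
  server <- Rook::Rhttpd$new()
  server$add(app = hello_app, name = "hello")
  server$start(port = 9000)   # e.g. http://127.0.0.1:9000/custom/hello
}
```

Because the app is a plain function, it can be unit-tested by calling it directly with a fake environment, independently of the web server.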

working on a remote R session

The R session that I am working in is on a remote cluster, due to memory constraints, and the data is stored remotely. I am therefore using Notepad++ to edit my files and just pasting them into my SSH session as I go along. What is the best way to work with the remote session so I can take advantage of code completion and the other features available in editors like RStudio? Any best-practice suggestions for working over remote connections? I imagine this must be the situation for most R users who work with large data sets.
From: http://www.sciviews.org/_rgui/projects/Editors.html
The famous Vim editor now also provides syntax highlighting for R. You can get a plugin to integrate R here. For Windows, there is another plugin using R-DCOM here. There is an alternative, multiplatform plugin here. Also look at the Vim web site, because there are other interesting resources there for R users (see here for an overview). There is also an R package that better integrates debugging facilities in Vim: adtdbg.

What to include when teaching a UNIX course? [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Want to improve this question? Update the question so it's on-topic for Stack Overflow.
Closed 13 years ago.
I was asked to teach UNIX to a group of people in my company who probably don't know much about it, coming from a completely Windows background. Help me decide the course contents. I don't want to go in and just teach them a set of commands; I want it to be more along the lines of the UNIX architecture: the file system, pipes, how everything is a file, process creation and handling, virtual memory management, etc. What do you all think? Help me cover these topics.
You must read The UNIX Philosophy by Mike Gancarz. It might be worthwhile using it as a text, but it will definitely give you a lot of the reasons why UNIX is good and how best to leverage its power.
Unix topics in order of importance:
Pipes
Tool philosophy (do one thing well)
The permissions model
Shell syntax
Interacting with processes
Picking and using an editor
Basic C programming
An ideal way for Windows folks to learn how to function in a Unix environment is to have them use Cygwin on their Windows box.
Both Unix and Windows share most of their basic OS concepts: file descriptors, processes, virtual memory, etc... The only main difference you will need to address immediately is the different path tree structure: single root plus mount points vs drive letters.
I think you have to distinguish between several widely different topics:
using the shell:
You will need to get into concepts like process structure, file descriptors, basic commands.
programming under Unix:
You will need to address IDEs, compiling tools, building tools, and dynamic linking.
using the Unix desktop:
Modern Unices all have fairly comprehensive desktop environments that work in a pretty similar way to Windows... no big learning curve there.
You should include information about the shell. Explain the standard old method of using the output of one command as the input of the next via the pipe.
Also show how powerful output redirection is, and how error redirection works (2> and 2>&1).
Have your "Students" install Cygwin on their workstations to give them the opportunity to run "Unix-Commands" right inside Windows.
Underlying theory is always good; mention why UNIX is designed the way it is. Eric Raymond's The Art of UNIX Programming is good for that.
If they're going to be developing for UNIX, some of the standards won't go amiss, the Filesystem Hierarchy Standard and POSIX for example.
Sounds to me like you want to take a basic OS course and make it UNIX-specific. If you're designing the course for developer types, I'd think that would work well: they'd be familiar with basic OS constructs and would appreciate knowing the UNIX-specific flavors and then the commands that interact with each construct.
If you're designing the course for regular people, though, they might get lost in the OS theory. Even with a simple OS example, the whole thing gets very complicated.
My favorite UNIX book of all time is "A Student's Guide to UNIX". I'm sure there are many great competitors out there, but what I liked was that it combined commands with basic theory and bundled each section with a bit of history on why given parts of the OS were designed a certain way and/or on who the designers were. So much of UNIX is the commands; it was nice to have all those little blurbs, and they were often good memory joggers.
I'd start with fundamentals and compare each concept to its Windows counterpart. Kernel, driver, memory, process, daemon, file, user, a shell (vs. the command prompt), a filesystem etc.
Let them run a UNIX-like system, e.g. from a live CD (Ubuntu or Knoppix; maybe some other live UNIX systems as well).
If they are power windows users, compare bash to powershell.
Most Windows users also don't get the concept of init scripts vs. Windows services, so I would explain that as well.
General directory structure.
Sockets and other various IPCs. Unix lets you treat them as files, which makes programming easier.
pthread library and concurrency concepts.
I would go through the Linux Administration Handbook and look at the chapters in the book and focus on those concepts that are important to a user as opposed to an administrator.
In addition to all of the other great suggestions, I would recommend discussing regular expressions in detail with examples in sed, awk, perl, vi, etc. REs are used in so many places, they really deserve their own place in the discussion. Add in a discussion of the common text processing utilities - cut, paste, grep, etc.
