I have a device streaming data to the PC and I would like to use R to produce real-time streaming plots. I realize that JavaScript is probably the best tool for this, but I don't have the skill set for reading data and plotting in JavaScript.
I am aware of gganimate, gifski, etc. for animated gifs, but I don't think those will be able to stream data.
I have tried Shiny with invalidateLater and it works, but struggles to update faster than about 5 frames per second.
I have tried running an R process that generates images on a timer, with a simple HTML page that reloads the image every 100 ms, but this produces frequent broken links when the HTML page tries to load an image that R is actively writing.
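One common mitigation for the partial-write problem described above is to write each frame to a temporary file and then rename it into place, since a rename on the same filesystem is atomic and the HTML page never sees a half-written file. A minimal sketch (the file path and plotted data are placeholders, not from the original setup):

```r
# Sketch: avoid serving half-written images by writing to a temp file
# and renaming it into place (rename is atomic on the same filesystem).
plot_file <- "www/live.png"            # hypothetical path the HTML page loads
tmp_file  <- paste0(plot_file, ".tmp")

png(tmp_file, width = 800, height = 400)
plot(rnorm(100), type = "l")           # stand-in for the real streaming data
dev.off()

file.rename(tmp_file, plot_file)       # swap in the finished image atomically
```

Run inside the existing timer loop, this keeps the served image valid at every instant, at the cost of one extra rename per frame.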
Are there any other options in R that don't involve learning JavaScript or JavaScript packages? Any other general advice?
I have a current project that consists of 3 parts:
1. An interface for clients to upload datasets from their equipment.
2. Process the uploaded datasets using R and some preset coefficients.
3. Display the results in a graph with a regression line, allowing the user to click points on the graph to remove them where needed, with the regression line redrawn automatically after a point is removed.
Part 1: This is already done using PHP/Laravel. A simple upload and processing interface.
Part 3: I've been able to set this up in chart.js without any problems.
Part 2 is the sticking point at the moment. What I'd like is to be able to send the data to an R script and just get the data points back so that I can display them. Can anyone give suggestions as to the best way to do this? Is there an API available? Or do I need to install software on the server (not an issue if I do, but I'm hoping to avoid the need if possible)?
TIA
Carton
There is the package shiny to do everything in R (user-side GUI and server-side R processing). Since you already did part 1 in PHP, you can either write an R script that is executed via a shell call from PHP, or build an R REST API with plumber.
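A minimal plumber sketch of the REST-API route (the endpoint name, input format, and coefficient are hypothetical placeholders, not the poster's actual processing code):

```r
# plumber.R -- hypothetical REST endpoint that PHP could POST data to.
library(plumber)

#* Process uploaded data points with a preset coefficient
#* @post /process
#* @serializer json
function(req) {
  # Expects a JSON body like {"x": [...], "y": [...]}
  input  <- jsonlite::fromJSON(req$postBody)
  coef_a <- 1.5                           # placeholder preset coefficient
  data.frame(x = input$x, y = input$y * coef_a)
}

# Start the API with:
#   plumber::pr("plumber.R") |> plumber::pr_run(port = 8000)
```

PHP would then POST the uploaded dataset to `http://localhost:8000/process` and receive the processed points back as JSON for chart.js.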
I have a very large and complex R Shiny app and I am trying to find bottlenecks or parts of the code to improve via the profvis package.
The problem is that profvis itself performs very slowly (because of the processes and functions in my Shiny app), so it lags and is almost impossible to properly view and navigate the profile or the flame graph.
Starting the profvis profile via R or via the browser (Firefox and Chrome tested) doesn't really make a big difference here.
I am only testing one (the main) feature/calculation in my Shiny app, which is initiated by one action button, so I can't really test fewer features or make the profile "shorter".
Any help or tips are appreciated, especially ways to run profvis faster. Another option I tried was to wrap only parts of my code inside the Shiny app with profvis, but I didn't find a way to get this to work.
Thank you!
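One alternative for profiling only the button-triggered calculation, without running profvis over the whole app, is base R's sampling profiler. A sketch (the input id and calculation function are placeholders, assuming a standard `observeEvent` handler):

```r
# Sketch: profile only the expensive calculation with base R's sampling
# profiler (Rprof), then print a by-self-time summary -- much lighter
# than a full profvis session over the entire Shiny app.
observeEvent(input$run, {              # hypothetical action-button id
  Rprof("calc.out", interval = 0.01)   # start sampling to a file
  result <- run_main_calculation()     # placeholder for the real work
  Rprof(NULL)                          # stop profiling
  print(summaryRprof("calc.out")$by.self)
})
```

The text summary is far less pretty than the flame graph, but it loads instantly and usually makes the dominant bottleneck obvious; the same `calc.out` file can later be fed to profvis for a visual view of just that one calculation.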
We are working on a project where, following an analysis of data using R, we use rmarkdown to render an HTML report which is returned to the users who uploaded the original dataset. This will be part of a complex online system involving multiple steps. One of the requirements is that the rmarkdown HTML will be serialized and saved in a SQL database for the system to return to users.
My question is: is there a way to render the markdown directly to an object in R to allow for direct serialization? We want to avoid saving to disk unless absolutely needed, as there will be multiple parallel processes doing similar tasks and resources might be limited. From my research so far it doesn't seem possible, but I would appreciate any insight.
You are correct, it's not possible due to the architecture of rmarkdown.
If you have this level of control over your server, you could create a RAM disk, using part of your memory to simulate a hard drive. The actual hard disk won't be used.
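A sketch of that approach, assuming a Linux server where `/dev/shm` is already a tmpfs (RAM-backed) mount: render into a directory under it, read the bytes back, and serialize those to the database. The `report.Rmd` path is a placeholder.

```r
# Sketch: render to a tmpfs-backed path so the physical disk is never
# touched, then read the HTML back as a raw vector for DB storage.
out_dir <- tempfile(tmpdir = "/dev/shm")   # assumes Linux tmpfs mount
dir.create(out_dir)

html_path <- rmarkdown::render("report.Rmd", output_dir = out_dir)
html_raw  <- readBin(html_path, what = "raw", n = file.size(html_path))

unlink(out_dir, recursive = TRUE)  # clean up; html_raw can now go to SQL
```

Each parallel process gets its own `tempfile()` directory, so concurrent renders won't collide.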
I work for a company that runs market research surveys for clients. Hundreds of different clients participate in each survey, so they do not all get an individual report. Instead, we have a report template in PowerPoint which is the same for all clients, and then we use a program called E-Tabs Enterprise to populate the template for each client with that client's own survey results.
These reports are typically about 100 slides long, and contain a mixture of static text and images, which are the same for all clients, and survey results, which vary between different clients. The reports are sent as a PowerPoint or as a PDF to clients via an FTP.
To increase efficiency, I want to switch either to Python Plotly Dash or RStudio Shiny to create static PDF reports (although I will also be interested in making dashboards in the future). I am trying to see which, if either of these, has the capabilities I need. I am already competent at Python Pandas.
It is clear from the Shiny website (link here: https://shiny.rstudio.com/articles/) that Shiny can make a report and export it as a static PDF. However, I have two questions:
Can the open-source version of Plotly Dash also be used to create a report and export it as a PDF?
Is it possible, within (open-source) Shiny or Dash, for the code to loop through the data for the different clients and export each as a separate PDF?
If this is not possible, could you please tell me what the limitations are and whether it is possible in the paid versions of the two programs?
Please let me know if my question is in any way unclear. I would also be open to recommendations for other software if there is something out there more suitable for what I need. Thank you for your help in advance.
Kind regards
Thank you for your answers. I can see that RMarkdown has the capabilities I need. It seems that there is no good equivalent of RMarkdown in Python at present, unfortunately, so RMarkdown is the right tool to use.
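For anyone landing here, the per-client loop is typically done with parameterized RMarkdown: one template rendered once per client. A sketch (the template name, client ids, and the assumption that `report.Rmd` declares a `client` entry under `params:` in its YAML header are all placeholders):

```r
# Sketch: one parameterized Rmd rendered once per client to a PDF.
# report.Rmd would declare in its YAML header:
#   params:
#     client: NULL
clients <- c("client_a", "client_b")   # placeholder client ids

for (cl in clients) {
  rmarkdown::render(
    "report.Rmd",
    output_format = "pdf_document",
    output_file   = paste0("report_", cl, ".pdf"),
    params        = list(client = cl)
  )
}
```

Inside the template, `params$client` selects that client's rows from the survey data, so the same 100-slide-equivalent layout is filled with each client's own results.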
I need to visualize a very large graph (> 1 million) on a website; the library should receive a JSON and generate a PNG.
I've already tried Graphviz with sfdp, but the wait is too long. The user will not be able to interact with a GUI, so Cytoscape or Gephi are not an option.
Another option could be exporting a .dot file to the web, but I don't know how efficient that would be.
http://www.gnuplot.info/ and http://dygraphs.com/ are two of my favorite charting libraries for the web. For plotting a million data points I would probably use gnuplot. I would look at caching the generated images as much as possible, perhaps even generating them in a batch rather than at request time. I think you will run into performance issues trying to pass the data in via JSON and render the graph client-side.