This question already has answers here:
Run R script from command line
(7 answers)
Closed 2 years ago.
Thanks for your time.
I have a more general question, related to a business use case.
I created an R script that takes an excel file, checks certain conditions, and then exports out another excel file.
I created this for a specific use case, and for other people in my organization on a certain team.
The other people in my organization would like to be able to run this R script on their own, without having to contact me every time they want to run it. They could be running it upwards of a few times a day across the entire team.
On my end, I do not want the team members to have to open up R each time they want to run the script. It doesn't seem very user friendly from their perspective, and I would prefer to keep the experience easy for them.
So here's my question: Is there any application I can find or create that the team members can use to run my R script, without having to use R explicitly?
I've done quite a bit of googling around. One solution I saw was to create an executable version of the file, but I believe that would still be tricky, since it would involve customizing each of the team members' computers.
I also thought that Shiny might be able to fill the gap? But I am not familiar with Shiny as of now, and do not know exactly what it can do.
Thanks for any other suggestions you may have.
There are mainly two ways. The first is to use Rscript, like below:
C:\Users\automat7> Rscript app.r
or, in some cases (such as with Shiny apps or one-line scripts), you can use
R -e "shiny::runApp(address_to_folder, args)"
You may need to add R's bin folder to your PATH environment variable if you are using Windows.
You can follow the instructions here for that: How to Add a folder to Path environment variable in Windows10
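Once Rscript works from the command line, the script itself can take the input and output file paths as arguments, so team members never need to open R. A minimal sketch, assuming the script is saved as check_file.R (the file name and usage line are invented for illustration):

```r
# Parse command-line arguments for a script run via Rscript.
parse_args <- function(args = commandArgs(trailingOnly = TRUE)) {
  if (length(args) < 2) {
    stop("Usage: Rscript check_file.R <input.xlsx> <output.xlsx>")
  }
  list(input = args[1], output = args[2])
}
# Called as: Rscript check_file.R in.xlsx out.xlsx
# parse_args() then returns list(input = "in.xlsx", output = "out.xlsx")
```

A one-line batch file wrapping that Rscript call can then be double-clicked by non-R users.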
Related
I am in the process of automating, in R, a number of graphs produced where I work that are currently made in Excel.
Note that for now, I am not able to convince people that doing the graphs directly in R is the best solution, so the answer cannot be "use ggplot2", although I will push for it.
So in the meantime, my path is to download, update and tidy data in R, then export it to an existing Excel file where the graph is already constructed.
The way I have been trying to do that is through openxlsx, which seems to be the most frequent recommendation (for instance here).
However, I am encountering an issue that I cannot solve this way (I asked a question there that did not inspire a lot of answers!).
Therefore, I am going to try other ways, but I keep being directed to the aforementioned solution. What are the existing alternatives?
I have a large population survey dataset for a project and the first step is to make exclusions and have a final dataset for analyses. To organize my work, I must continue my work in a new file where I derive survey variables correctly. Is there a command used to continue work by saving all the previous data and code to the new file?
I don't think I understand the problem you have. You can always create multiple .R files and split the code among them as you wish, and you can also arrange those files as you see fit in the file system (group them in the same folder with informative names and comments, etc.).
As for the data side of the problem, you can load your data into R, make any changes/filters needed, and then save it to another file with one of the billions of functions to write stuff to disk: write.table() from base R, fwrite() from data.table (which can be MUCH faster), etc.
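A small sketch of that workflow: apply the exclusions, save the result, and load it again from the next script (the column names and the filter are invented for illustration):

```r
# Toy survey data; in practice this comes from the original dataset.
df <- data.frame(id = 1:5, age = c(17, 25, 40, 16, 60))
adults <- df[df$age >= 18, ]                 # exclusion step

out <- tempfile(fileext = ".csv")            # use a real project path in practice
write.csv(adults, out, row.names = FALSE)    # base-R writer; fwrite() is faster
reloaded <- read.csv(out)                    # first line of the next script
```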
I feel that my answer is way too obvious. When you say "project", do you mean "something I have to get done", or an actual project that you can create in RStudio? If it's the first, then I think I have covered it. If it's the second, I never got to use that feature, so I am not going to be able to help :(
Maybe you can elaborate a bit more.
I have multiple R scripts for different models and I need to make them easily accessible for other people to use. So I would like to have one script which only contains source() calls to run the other scripts, without people having to search through many files to find the right one. Some of the scripts have more than one model in them, so if possible I would like to source only specific blocks of lines from those scripts.
For example to find the accuracy of ARIMA in different ways I have to run the following different scripts in turn;
Read data
Arima
Accuracy of in-sample
Accuracy out Read data
Accuracy of out forced param
Accuracy out sample
The number of different scripts makes the risk of an error higher, especially as 3 of those scripts contain 5 other models. If running them myself I would just highlight the specific model I want to use and run it, but for other people that may be more confusing.
I know that I have to use source() to get the scripts to run, but I'm stuck as to how to source only certain parts of a script, and as to the correct way to source.
Rather than trying to source parts of scripts, move these bits of code into functions, and then just call the functions you need.
Start by searching around for how to write R functions.
You can put all your functions into a single file, source it, and then make your recipes of functions with orders for others.
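A sketch of that idea, with placeholder functions standing in for the asker's scripts ("Read data", "Arima", "Accuracy of in-sample" — the names and the toy series are invented):

```r
read_data <- function(path) {
  # in practice: read.csv(path); a toy series is used here instead
  data.frame(y = c(112, 118, 132, 129, 121, 135, 148, 148, 136, 119))
}
fit_arima <- function(dat) arima(dat$y, order = c(1, 0, 0))
in_sample_rmse <- function(fit) sqrt(mean(residuals(fit)^2))

# The "recipe" another user runs, one call per step:
dat  <- read_data("sales.csv")
fit  <- fit_arima(dat)
rmse <- in_sample_rmse(fit)
```

Each model then gets its own function, so other people pick a function by name instead of highlighting lines inside a script.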
You could make one script that automates the whole thing, and then use knitr to create a Word or PDF document of the whole thing for other people to read easily.
I have just finished writing a set of R scripts. There is one master file and 5 additional external R files called via the source() function. I was wondering whether it is possible to unify the five external scripts into one single file, dividing the entire code into five sections run at different moments.
Is it possible to do so using source()? If not, which strategy do you suggest?
As @flodel said in the comment, write functions. There are a few previous answers that are worth checking out:
Workflow for statistical analysis and report writing
writing functions vs. line-by-line interpretation in an R workflow
How do you combine "Revision Control" with "Workflow" for R?
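A minimal sketch of the single-file idea: each former script becomes a function in one file, sourced once by the master script. The section names are invented, and the file is written to a temp path here only so the example is self-contained:

```r
# --- contents of the unified "sections" file (normally a real .R file) ---
sections_file <- tempfile(fileext = ".R")
writeLines(c(
  'section_load  <- function() "data loaded"',
  'section_model <- function() "model fitted"'
), sections_file)

# --- master script ---
source(sections_file)   # defines every section at once
section_load()          # then call the sections in the order you need
section_model()
```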
Currently, I create multiple copies of a script that each have slightly different parameter values (3 different scripts). I run them from different folders. I'm wondering if I could just open three different terminals and run the same script. Of course, after starting each run at a particular parameter value, I would go back to the original script, change the parameter value, save the script, and then run it again ...
I guess I'm not sure of all the steps that are performed under the hood to run a script. Any thoughts would be appreciated.
No, each run will have its own memory and they will in no way overlap.
The only problem will be if both programs try to write to the same file at the same time.
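One way to sidestep both the script copies and the shared-file risk is to read the parameter from the command line, so the same script can run in three terminals at once and write to a parameter-specific file. A sketch, with the file name and values invented:

```r
#   Rscript simulate.R 0.1
#   Rscript simulate.R 0.5
#   Rscript simulate.R 0.9
get_param <- function(args = commandArgs(trailingOnly = TRUE)) {
  if (length(args) == 0) 0.5 else as.numeric(args[1])  # 0.5 = assumed default
}
p <- get_param()
# Run the model with p, then save results under a parameter-specific name
# so the three processes never write to the same path:
outfile <- sprintf("results_p%s.rds", p)
```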