Is it possible to close an application that was launched from within R?
Assume that I have opened a CSV file my_file.csv with its associated application via the shell.exec function. I then want to close this application.
Since R has no control over other programs, you cannot reliably close files that were opened outside of R. You do not even know which program to close: on one computer a CSV file may be opened with Notepad, on another it may be opened with Excel.
If you know the program, you can use system2() or a similar command to kill it. For example, if you want to close Excel, execute system2("taskkill", args = "/im excel.exe"). Note that this will close all open instances of the program (here, Excel), not just a specific one.
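A minimal sketch of that idea, assuming Windows and that Excel is the program to be closed (adding the /f flag would force-terminate the process and discard unsaved changes):

if (.Platform$OS.type == "windows") {
  # ask Windows to close every process whose image name is excel.exe
  system2("taskkill", args = c("/im", "excel.exe"))
}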
Related
In a project, many separate scripts are executed, and they read and write files from and for each other. It has become quite confusing which file comes from where. Of course, this is bad software design, but that is how the project has grown over a long time.
Now, I would like to execute all scripts in their proper order and capture which files are read and which ones are written by each script.
Is there, e.g., a way to monitor and log the input and output of the R process while the script is running (from within R)? Or any other ideas for a solution?
I am running R under Windows 10.
I have an Excel file that sits on a shared drive (MS OneDrive), and I would like to run an R script that updates some data in that file.
Is there a way in R to force-close any open instances of that file so that the data update runs OK?
I have tried the close() and file() functions but without success.
Any ideas?
Thank you
I am trying to use an R script as a data source for Power BI. I am a regular user of R but am new to Power BI. When all the datasets imported by the R script come from SQL databases, I can import the resulting data frames from the R script fine. However, I have a script that uses a .csv file that Power BI's R session can't find, which results in the error:
Error: 'times_of_day_grid.csv' does not exist in current working directory ('C:/Users/MyUserName/RScriptWrapper_ac2d4ec7-a4f6-4977-8713-10494f4b0c4f').
The .pbix file and the R script are both stored in the same folder as the .csv file.
I have tried manually setting the working directory by inserting the following into the script:
setwd("C:/Users/MyUserName/Documents/R/Projects/This Project Folder")
But this just results in the message
"Connecting - Please wait while we establish a connection to R"
And later if I leave it running:
Unable to connect
We encountered an error while trying to connect.
Details: "ADO.NET: R execution timeout. The script execution was
terminated, since it was running for more than 1800000 miliseconds."
I have also tried specifying the full paths of the .csv files in read_csv(), but this results in the same timeout warning.
Any ideas as to how I can edit my script (or the settings in Power BI) to get around this? (The script only takes a minute or so to run in RStudio.)
Don't forget that you can load your .csv file using the built-in functionality in Power BI via Get Data > Text/CSV and then go to Edit Queries and handle the R scripting from there. That way you won't have to worry about setting the working directory in the R script at all.
You can even load multiple files and work on each and every one of them using the approach described in Operations on multiple tables / datasets with Edit Queries and R in Power BI.
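As a rough sketch of what the R step in Edit Queries could look like: when you add a Run R Script step, Power BI passes the incoming table to the script as a data frame named dataset, and any data frame the script creates is offered back as a query result (the column name below is made up for illustration):

# 'dataset' is the table already loaded via Get Data > Text/CSV
output <- dataset
output$hour <- as.integer(substr(output$time_of_day, 1, 2))  # hypothetical column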
Please let me know how this works out for you.
I am facing difficulty in integrating R with Tableau.
Initially, when I created a calculated field, it asked for the Rserve package in R and was not allowing me to drag the field to the worksheet. I have installed this package, but it still shows an error saying:
"An error occurred while communicating with the Rserve service. Tableau is unable to connect to the service. Verify that the server is running and that you have access privileges."
Any inputs? Thank you.
You need to start Rserve. If you have successfully installed the Rserve package, simply run this (in RGui, RStudio, or wherever you run R scripts):
> library(Rserve)
> Rserve()
You can test your connection to Rserve in Tableau under Help > Settings and Performance > Manage R Connection.
As of Tableau 9, you can use *.rdata files with Tableau. Tableau 9 will read the first item stored in the *.rdata file. Just open an *.rdata file under "Statistical Files" in the Tableau intro screen.
To do this:
save(myDataframe, file = "Myfile.rdata")
This will save the file with the dataframe stored in it. You can save as many items as you want, but Tableau will only read the first. It can read vectors and variables as well if they are in the first item. Note that .rdata files compress data quite a bit; I recently compressed 900 MB down to 25 MB. However, Tableau will still need to decompress the data to use it, so be careful about memory.
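For instance, a small sketch of saving two objects, where (per the behaviour described above) only the object listed first would be picked up by Tableau; the object names are placeholders:

# Tableau would read myDataframe; otherSummary would be ignored
save(myDataframe, otherSummary, file = "Myfile.rdata")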
Is there a way to pass commands (from a shell) to an already running R runtime/R GUI, without copy and paste?
So far I only know how to call R from the shell with the -f or -e options, but in both cases a new R runtime will process the R script or R command I pass to it.
I would rather have an open R runtime waiting for commands passed to it via whatever connection is possible.
What you ask for cannot be done. R is single-threaded and has a single REPL (read-eval-print loop) which is attached to a single input, e.g. the console in the GUI, or stdin if you pipe into R. But never two.
Unless you use something else, e.g. the most excellent Rserve, which (when hosted on an OS other than Windoze) can handle multiple concurrent requests over TCP/IP. You may, however, have to write your own custom connection. Examples for Java, C++ and R exist in the Rserve documentation.
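As an illustration of that route, here is a minimal sketch using the RSclient package to send commands from one R session to a running Rserve instance (this assumes Rserve is already listening on its default port 6311):

library(RSclient)
con <- RS.connect(host = "localhost", port = 6311)  # default Rserve port
RS.eval(con, Sys.time())   # the expression is evaluated in the server process
RS.close(con)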
You can use Rterm (under C:\Program Files\R\R-2.10.1\bin on Windows with R version 2.10.1). Or you can start R from the shell by typing "R" (if the shell does not recognize the command, you need to modify your PATH).
You could try simply saving the workspace from one session and manually loading it into the other one (or any variation on this theme, like saving only the objects you share between the two sessions with saveRDS or similar). That would require some extra load and save commands, but you could automate this further by adding some lines to your .Rprofile file, which is executed at the beginning of every R session. Here is some more detailed information about R on startup. But I guess it all highly depends on what you are doing inside the R sessions. hth
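A minimal sketch of the saveRDS variant, with a placeholder object name and file path:

# session A: write a single object to disk
saveRDS(shared_results, file = "C:/temp/shared_results.rds")

# session B: read it back under whatever name you like
shared_results <- readRDS("C:/temp/shared_results.rds")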