I need help with passing an argument as a variable name in an R script from the terminal. I'll run the script as follows:
R < script.R --args "hello"
And, in the script there should be something like this:
args <- commandArgs(trailingOnly = TRUE)
assign(args[1],24)
save(args[1], file="output.RData")
But I need to use the argument as the variable name. What I mean is the following: if I run the script with the "numbers" argument, the variable name inside the script should be numbers.
assign(args[1], 24)
does the trick. But, inside the save function, args[1] does not work. How can I pass it as a variable name?
Does it work if you try
saveRDS(get(args[1]),file="output.rds")
?
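It should: get() looks up an object by the name stored in a string, which is exactly what args[1] holds. A minimal sketch of the full script with that approach (the "numbers" argument and file names are just examples):
# script.R
args <- commandArgs(trailingOnly = TRUE)
assign(args[1], 24)                          # creates an object named after args[1]
saveRDS(get(args[1]), file = "output.rds")   # get() retrieves that object by name
Run it, for example, as:
Rscript script.R numbers
Note that an .rds file stores only the value, not the name; readRDS("output.rds") simply returns the value for you to assign to any name.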
You won't get a text file with the save function; if you want a text version you would need to use dump(). The following will "work" despite the extension, but the file is still an .RData file even without the .RData extension:
arg=1
argname="reports"
assign(argname, arg)
reports
#[1] 1
save(reports, file="test.txt")
rm(reports)
rm(argname)
rm(arg)
load("test.txt")
To use dump:
dump('reports', file="test2.txt")
This is what would appear in that file; it is parseable (and human-readable) R code:
reports <-
1
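For completeness, save() also accepts object names as strings through its list argument, which sidesteps the problem with save(args[1], ...) in the original script. A short sketch, reusing the question's argument handling:
args <- commandArgs(trailingOnly = TRUE)
assign(args[1], 24)
# 'list' takes a character vector of object names, so the dynamic name works here
save(list = args[1], file = "output.RData")
Calling load("output.RData") later restores an object with whatever name was passed on the command line (for example, numbers).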
Related
I am trying to use
var <- as.numeric(readline(prompt="Enter a number: "))
and later use this in a calculation.
It works fine when running in RStudio but I need to be able to pass this input from the command line in Windows 10
I am using a batch file with a single line
Rscript.exe "C:\My Files\R_scripts\my_script.R"
When it gets to the user input part, it freezes and doesn't provide the expected output.
From the documentation of readline():
This can only be used in an interactive session. [...] In non-interactive use the result is as if the response was RETURN and the value is "".
For non-interactive use - when calling R from the command line - I think you've got two options:
Use readLines(con = "stdin", n = 1) to read user input from the terminal.
Use commandArgs(trailingOnly = TRUE) to supply the input as an argument from the command line when calling the script instead.
Below is more information on each option.
1. Using readLines()
readLines() looks very similar to the readline() you're using, but it is meant to read files line by line. If we point it at standard input (con = "stdin") instead of a file, it will read user input from the terminal. We set n = 1 so that it stops reading when you press Enter (that is, it reads only one line).
Example
Use readLines() in an R script:
# some-r-file.R
# This is our prompt, since readLines doesn't provide one
cat("Please write something: ")
args <- readLines(con = "stdin", n = 1)
writeLines(args[[1]], "output.txt")
Call the script:
Rscript.exe "some-r-file.R"
It will now ask you for your input. In PowerShell, for example, I supplied "Any text!".
Then the output.txt will contain:
Any text!
2. Using commandArgs()
When calling an Rscript.exe from the terminal, you can add extra arguments. With commandArgs() you can capture these arguments and use them in your code.
Example:
Use commandArgs() in an R script:
# some-r-file.R
args <- commandArgs(trailingOnly = TRUE)
writeLines(args[[1]], "output.txt")
Call the script:
Rscript.exe "some-r-file.R" "Any text!"
Then the output.txt will contain:
Any text!
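In both approaches the value arrives as text. Since the original goal was a number to use in a calculation, here is a small sketch of the conversion (the doubling is only a placeholder calculation):
# some-r-file.R
args <- commandArgs(trailingOnly = TRUE)
var <- as.numeric(args[[1]])   # convert the text argument to a number
if (is.na(var)) stop("Please supply a numeric argument.")
cat("Twice the input:", var * 2, "\n")
Call the script:
Rscript.exe "some-r-file.R" 21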
This is likely a stupid question, but I have not found a workaround (at least not in anything I have searched for, though I might just not be using the right search parameters).
I want to call an executable in Windows, and send a file to it (in this case a Blaise man file), the name of which is variable in my script.
So, for example, I have
x<-2
myfile<-c(paste("FileNumber",x,".man", sep=""))
system("myapp.exe" myfile)
But I simply get
Error: unexpected symbol in "system("myapp.exe" myfile"
as if the command does not recognize myfile as an object, instead taking "myfile" as literal text.
I tried using a paste function to create a whole line command, but that also did not work.
The system command will not concatenate the command string and the myfile object for you; you have to do it yourself.
So, try this instead:
x<-2
myfile<-c(paste("FileNumber",x,".man", sep=""))
cmd <- paste("myapp.exe", myfile)
system(cmd)
Or just:
x<-2
system(paste("myapp.exe", c(paste("FileNumber",x,".man", sep=""))))
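Alternatively, system2() keeps the command and its arguments separate, so no manual pasting is needed. A sketch along the same lines (myapp.exe and the file-name pattern are taken from the question):
x <- 2
myfile <- paste0("FileNumber", x, ".man")
# system2() takes the command and a character vector of arguments separately
system2("myapp.exe", args = myfile)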
I am trying to use Sys.glob to open a file called "apcp_sfc_latlon_subset_19940101_20071231.nc". The following command works:
> Sys.glob(file.path("data/train", "apcp*"))
[1] "data/train/apcp_sfc_latlon_subset_19940101_20071231.nc"
But this command doesn't return anything, and I don't know why it doesn't work.
> Sys.glob(file.path("data/train", "apcp", "*"))
character(0)
I want the "apcp" bit as it's own argument because I will be passing a variable instead of a hard coded string.
Thank you.
file.path("data/train", "apcp", "*") returns "data/train/apcp/*" whereas file.path("data/train", "apcp*") returns "data/train/apcp*".
Thus in the first case you are looking for files inside the subdirectory apcp, while in the second (working) case you are looking for files whose names begin with apcp within the data/train directory.
If you want to be able to pass the apcp component as an argument, using paste0 will work:
starting <- "apcp"
file.path("data/train", paste0(starting, '*', collapse =''))
# "data/train/apcp*"
I have a number of R scripts that I would like to chain together using a UNIX-style pipeline. Each script would take as input a data frame and provide a data frame as output. For example, I am imagining something like this that would run in R's batch mode.
cat raw-input.Rds | step1.R | step2.R | step3.R | step4.R > result.Rds
Any thoughts on how this could be done?
Writing executable scripts is not the hard part; what is tricky is making the scripts read from files and/or pipes. I wrote a somewhat general function here: https://stackoverflow.com/a/15785789/1201032
Here is an example where the I/O takes the form of CSV files:
Your step?.R files should look like this:
#!/usr/bin/Rscript
OpenRead <- function(arg) {
if (arg %in% c("-", "/dev/stdin")) {
file("stdin", open = "r")
} else if (grepl("^/dev/fd/", arg)) {
fifo(arg, open = "r")
} else {
file(arg, open = "r")
}
}
args <- commandArgs(TRUE)
file <- args[1]
fh.in <- OpenRead(file)
df.in <- read.csv(fh.in)
close(fh.in)
# do something
df.out <- df.in
# print output
write.csv(df.out, file = stdout(), row.names = FALSE, quote = FALSE)
and your csv input file should look like:
col1,col2
a,1
b,2
Now this should work:
cat in.csv | ./step1.R - | ./step2.R -
The - are annoying but necessary. Also make sure to run something like chmod +x ./step?.R to make your scripts executable. Finally, you could store them (without the extension) inside a directory that you add to your PATH, so you will be able to run them like this:
cat in.csv | step1 - | step2 -
Why on earth you want to cram your workflow into pipes when you have the whole R environment available is beyond me.
Make a main.r containing the following:
source("step1.r")
source("step2.r")
source("step3.r")
source("step4.r")
That's it. You don't have to convert the output of each step into a serialised format; instead you can just leave all your R objects (datasets, fitted models, predicted values, lattice/ggplot graphics, etc) as they are, ready for the next step to process. If memory is a problem, you can rm any unneeded objects at the end of each step; alternatively, each step can work with an environment which it deletes when done, first exporting any required objects to the global environment.
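For example, one way to sketch the environment idea is with sys.source(), running each step in its own throwaway environment and exporting only what later steps need (the object name model_fit is purely illustrative):
step2_env <- new.env()
sys.source("step2.r", envir = step2_env)            # run the step in its own environment
model_fit <- get("model_fit", envir = step2_env)    # keep only what later steps need
rm(step2_env)                                       # discard the step's temporaries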
If modular code is desired, you can recast your workflow as follows. Encapsulate the work done by each file into one or more functions. Then call these functions in your main.r with the appropriate arguments.
source("step1.r") # defines step1_read_input, step1_f2
source("step2.r") # defines step2_f2
source("step3.r") # defines step3_f1, step3_f2, step3_f3
source("step4.r") # defines step4_write_output
step1_read_input(...)
step1_f2(...)
....
step4_write_output(...)
You'll need to add a line at the top of each script to read in from stdin. Via this answer:
in_data <- readLines(file("stdin"),1)
You'll also need to write the output of each script to stdout().
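Putting those two pieces together, a minimal sketch of one step might look like this (it reads every line from stdin rather than just one, and toupper() stands in for the real work):
#!/usr/bin/Rscript
in_data <- readLines(file("stdin"))    # read everything piped into the script
out_data <- toupper(in_data)           # placeholder transformation
writeLines(out_data, con = stdout())   # pass the result on to the next step in the pipe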