Unable to get R script to execute second part of code to download a csv - r

I'm trying to create a script in R that will:
Part 1: Check if a folder exists and, if not, create it
Part 2: Download a csv file from the internet and place it in the specified folder
The script executes Part 1 fine and creates the folder; however, it won't execute Part 2 and download the file automatically. If I run Part 2 manually in RStudio it works, but I want it to run automatically.
A script containing just Part 2 also executes fine on its own.
#Create directory first
if (!file.exists("data")) {
  dir.create("data")
}
#Download file
fileUrl <- "https://d396qusza40orc.cloudfront.net/getdata%2Fdata%2Fss06hid.csv"
download.file(fileUrl, destfile = "./data/cameras.csv")
There are no error messages; the script just executes Part 1 and never runs Part 2.
Can someone advise what I'm missing here? TIA
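For reference, a minimal, slightly more defensive sketch of the same two steps (the dir.exists() check, file.path(), and mode = "wb" are assumptions added for illustration, not from the original post):

# Part 1: create the data directory if it does not already exist
if (!dir.exists("data")) {
  dir.create("data")
}

# Part 2: download the csv only if it is not already present;
# mode = "wb" guards against a corrupted binary write on Windows
fileUrl <- "https://d396qusza40orc.cloudfront.net/getdata%2Fdata%2Fss06hid.csv"
destfile <- file.path("data", "cameras.csv")
if (!file.exists(destfile)) {
  download.file(fileUrl, destfile = destfile, mode = "wb")
}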

Thanks, it is working now. It ran in the R console but not in RStudio, and then worked after restarting RStudio. Thanks all for the input.

Related

R project WD changes "on its own"

I'm working on a project involving several people. We are using GitHub and an R project so that we can work on it from different computers.
But I'm facing a really weird issue when I try to load files and/or get my working directory:
When I type getwd() in the R console, I indeed get the path where the project is saved on my computer:
> getwd()
[1] "/media/Data/Documents/my_R_project"
When I save getwd() in an object, the WD is not the same: it now points to the path where the scripts are saved:
folderpath <- getwd()
> folderpath
[1] "/media/Data/Documents/my_R_project/R/Script"
I get the same issue when I try to load a file located in /media/Data/Documents/my_R_project/R/Data: when I use read_csv (or any similar function) and write the file path as
read_csv("R/Data/file.csv"), I get an error stating that there is no such file in the directory /media/Data/Documents/my_R_project/R/Script/R/Data/.
How can I resolve this and make the WD used by read_csv() the right one ("/media/Data/Documents/my_R_project"), so that I don't have to specify the full path every time and people on other computers can run my script?
I'm working on Ubuntu 20.04 LTS.
In the end, the error was due to a conflict between the WD of the project and the WD used when I knitted the .Rmd file. The problem was solved by changing the knit options and setting the knit directory to the project directory.
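For anyone hitting the same mismatch, a minimal sketch of the kind of knit option that resolves it, assuming a setup chunk at the top of the .Rmd and the rprojroot package (both assumptions, not spelled out in the original post):

# In the .Rmd setup chunk: evaluate all later chunks relative to the
# RStudio project root instead of the folder containing the .Rmd file
knitr::opts_knit$set(root.dir = rprojroot::find_rstudio_root_file())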

RStudio will not remember my file path for a code chunk

When I run the following code I get an error.
library(tidyverse)
day <- read.csv("day.csv")[, -1]
glimpse(day)
Result:
Error in file(file, "rt") : cannot open the connection
However when I specify the file path it works:
library(tidyverse)
setwd("~/UofUProjects/IS6489")
day <- read.csv("day.csv")[, -1]
glimpse(day)
I don't want to have to specify the file path for every single code chunk I run. Is this a setting I may have accidentally disabled? I tried deleting and re-adding the file. I also tried clicking More and choosing Set Working Directory, and I still get the same error. I also tried updating RStudio and R to the newest versions.
The R Markdown file I was using was saved in my Downloads folder instead of my project folder. I was trying to set the working directory through the RStudio settings and assumed it would be recognized, but it did not work unless I actually saved the R Markdown file in the project folder or used setwd(). After saving the file in the correct location I ran the code without setwd() and it worked just the way I wanted.
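An alternative that sidesteps setwd() entirely is to build paths from the project root, for example with the here package (an assumption; the package is not mentioned in the original question):

library(tidyverse)
library(here)

# here() resolves paths relative to the RStudio project root, so the chunk
# works no matter where the .Rmd file itself happens to live
day <- read.csv(here("day.csv"))[, -1]
glimpse(day)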

How to specify where cronR saves output file

I have successfully run a simple cronR tutorial using the cronR add-in within RStudio. Here is the code that I have saved in an R script file:
library(glue)
current_time <- Sys.time()
print(current_time)
msg <- glue::glue("This is a test I am running at {current_time}.")
cat(msg, file = "test.txt")
The R script file is saved within a specific project directory, and the log file produced when the job runs is saved there too. However, the output of the script, test.txt, is saved in my home directory. I am on a Mac. The tutorial stated that this would happen, that cron saves any output in the home directory, and that if I want to change the location I have to "specify otherwise". However, the tutorial gives no instructions for how to do this, and I am not sure whether I am supposed to do it through the Terminal on the Mac and, if so, how. Changing the file path in the script (e.g. Documents/test.txt) changes nothing; the test.txt file is still saved in the home directory. I suspect I have to make this change somewhere else, but I am not sure where. Any help would be appreciated.
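One way to "specify otherwise" is simply to write the output with an absolute path inside the script itself, so it no longer depends on cron's working directory. A minimal sketch (the directory below is a hypothetical placeholder):

library(glue)

out_dir <- "/Users/yourname/Documents/my_cron_project"   # hypothetical path
if (!dir.exists(out_dir)) dir.create(out_dir, recursive = TRUE)

current_time <- Sys.time()
msg <- glue::glue("This is a test I am running at {current_time}.")
cat(msg, file = file.path(out_dir, "test.txt"))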
For anyone who runs into the same issue, I was able to solve my problem by using launchd. I followed this tutorial: https://babichmorrowc.github.io/post/launchd-jobs/. It worked as anticipated and now my .txt file is saved to the correct place. There is also a tutorial there for cron jobs, though if you are on a Mac, launchd is apparently the preferred method.

Same R history from different workspace

I am new to R and I just figured out why my history did not contain all my previous commands: R creates a .Rhistory file in each working directory.
I often change working directory and would like to have the history of all my past sessions in the same file. Is there a simple way to do that?
Thanks.
(I am on Mac OS 10.6 and I use RStudio.)
An easy way would be to manually save your history like this:
savehistory(file = "~/.Rhistory")
and then load it when you open an R command session:
loadhistory(file = "~/.Rhistory")
Otherwise you can edit your 'Rprofile.site' and add savehistory() and loadhistory() to the functions .Last and .First respectively.
More info about Rprofile.site: here
At startup, R will source the Rprofile.site file. It will then look for a .Rprofile file to source in the current working directory. If it doesn't find it, it will look for one in the user's home directory. There are two special functions you can place in these files. .First( ) will be run at the start of the R session and .Last( ) will be run at the end of the session.
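Putting the two pieces together, a minimal sketch of what that could look like in ~/.Rprofile (or Rprofile.site); the interactive() guard and try() wrappers are assumptions added so non-interactive jobs are not affected:

# Run at the start of every session: restore the shared history file
.First <- function() {
  if (interactive()) try(utils::loadhistory("~/.Rhistory"), silent = TRUE)
}

# Run at the end of every session: write the session history (including
# what was loaded) back to the shared file
.Last <- function() {
  if (interactive()) try(utils::savehistory("~/.Rhistory"), silent = TRUE)
}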

Batch R Script - setting the working directory and selecting output folder

I've been digging through several places for two simple needs but couldn't find a definitive answer.
I'm running an R script in batch mode. I'm not sure whether my solution is the best one, but I'm using R CMD BATCH as per http://stat.ethz.ch/R-manual/R-patched/library/utils/html/BATCH.html, called from a bat file.
First, I'd like the directory where the R script is saved, rather than where the bat file is saved, to be set as the working directory.
Secondly, I'd like to divert all the output from the R script (csv files and charts) to a specific directory other than the working directory. I cannot find any options for such a basic requirement.
The final idea is to be able to run the bat file across different computers, no matter where the R script is saved.
Thanks
You don't give code, so my answer is just advice on what I would do for such a job.
Use Rscript.exe; it is the way to go for batch scripts. R CMD BATCH is a sort of legacy tool.
You don't need to set or change the working directory. It is a source of problems.
You can launch your bat file from wherever you want; within it, change to the R script's location using cd. For example, your bat file can look like:
cd R_SCRIPT_PATH
Rscript yourscript.R arg1 arg2
You can use one of the script arguments as the output directory for your result files. For example, inside your script you do something like this:
args <- commandArgs(trailingOnly = TRUE)
resultpath <- as.character(args[1])
.....
write.table(res1, file = paste(resultpath, 'res1.csv', sep = '/'))
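Putting it together, a minimal sketch of the whole arrangement, with the output directory passed as the first argument (the file names, the placeholder data frame, and the dir.create() guard are illustrative assumptions):

# yourscript.R -- called as: Rscript yourscript.R C:/path/to/output
args <- commandArgs(trailingOnly = TRUE)
resultpath <- args[1]
if (!dir.exists(resultpath)) dir.create(resultpath, recursive = TRUE)

res1 <- data.frame(x = 1:3, y = c("a", "b", "c"))   # placeholder result
write.table(res1, file = file.path(resultpath, "res1.csv"),
            sep = ",", row.names = FALSE)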
