I have a few R files that contain functions imported and used by several other R files. I import these functions with the source function. Naturally, the scope of a particular file might change over time, and recently I wanted to rename a file I had already sourced in many other places.
I'm using RStudio, and I have been unable to find a way to do this except for either manually updating each dependent file, or creating some external code to scan through the files.
Is there no way to do consistent renaming in RStudio? Alternatively, am I doing something wrong by using source to add functions?
You may or may not find this satisfactory. Create a parent script with the old name that sources the script with the new name.
Extending this, you could just create a general preamble script, called something like "preamble.R", that sources all general utility scripts you have. Such an approach is common (I believe) with TeX. Then you only have one place to update file names.
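For example, a minimal sketch of such a preamble (the file names here are hypothetical):

# preamble.R - the single place that knows where the utility scripts live.
# If a utility file is renamed, only this file needs updating.
source("utils/string_helpers.R")
source("utils/plot_helpers.R")

Every dependent script then starts with source("preamble.R").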
I am trying to copy a file using Julia functions, with the hope of manipulating the file and then using that copied version for various tasks in the Julia programming language. Can someone provide some example code for copying a file in Julia?
I guess I could use read and then write, but it seems like I would be reinventing the wheel.
Is there a standard library function for this?
Inspired by this question.
Just use the built-in function cp(src, dst).
Copy the file, link, or directory from src to dst. force=true will first remove an existing dst.
Afterwards you can open the file and manipulate it. Of course, you could also open both the source and destination files simultaneously and copy and manipulate them line by line.
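A minimal usage sketch (the file names here are hypothetical):

# copy source.txt to a working copy, overwriting any existing file at the destination
cp("source.txt", "working_copy.txt"; force=true)

# then manipulate the copy, e.g. read it, edit it, and write it back
text = read("working_copy.txt", String)
write("working_copy.txt", replace(text, "foo" => "bar"))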
I'm no R programmer (because of this problem I started learning it); I'm a Python user. In a forecasting task I got a dataset, signalList.rdata, of a phenomenon called partial discharge.
I tried some commands to load, open, and view it, but hardly got a glimpse:
my_data <- get(load('C:/Users/Zack-PC/Desktop/Study/Data Sets/pdCluster/signalList.Rdata'))
but since I lack deep knowledge of R, I wanted to convert it into a CSV file, or any format that I can deal with in Python,
or explore it and copy-paste manually.
So I'm asking for any solution, whether using R, Python, or any other tool, to get what's in the .rdata file.
Have you managed to load the data successfully into your working environment?
If so, write.csv is the function you are looking for.
If not,
setwd("C:/Users/Zack-PC/Desktop/Study/Data Sets/pdCluster/")
# load() returns the name(s) of the loaded objects; get() retrieves the data itself
signalList <- get(load("signalList.Rdata"))
write.csv(signalList, "signalList.csv")
should do the trick.
If you would like to remove signalList from your working environment,
rm(signalList)
will accomplish this.
Note: changing your working directory isn't necessary; I just feel it makes the answer easier to read. You may also specify another path for saving your CSV within the second argument of write.csv.
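If you are unsure what the .Rdata file actually contains, you can also list and inspect the loaded objects first; a small sketch, assuming the file holds at least one object:

objs <- load("signalList.Rdata")  # load() returns the names of the loaded objects
print(objs)                       # see what was loaded
str(get(objs[1]))                 # inspect the structure of the first object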
This is an environment design question. I have a number of analysis/forecasting scripts I run each week, and each one relies on a number of files, with most files used by more than one script. I just had to change the name of one of the files, which was a real pain because I had to search through all my scripts and change the path declared in each one.
I would like to use a single .csv master file with file names and their paths, and create a centralized function that takes a list of file names, looks up their file paths, and then imports them all into the global environment. I could use this function in every script I run. Something like:
files_needed <- c("File_1", "File_2", "File_4", "File_6")
import_files(files_needed)
But then the function would require indirect variable assignment and declaring global variables, which I know are frowned upon, and I don't even know how to do both at once. I know I can write the logic for importing the file paths manually in every script, but there must be a better option, where I only write the import logic once.
Currently I have a master file that I source at the beginning of every script which loads my most commonly used packages and declares some helper functions I use frequently. I'd love to add this importing functionality in some capacity, but I'm open to solutions that look completely different to what I described. How do people generally solve this problem?
As a final note, many files have another twist, where they incorporate e.g. a date into the file name, so I need to be able to pass additional parameters in order to get the one I need.
Without a worked example this is untested code, but why not just make a list of imported files using those names?
files_needed <- c("File_1", "File_2", "File_4", "File_6")
my_imported_files <- setNames(
  lapply(files_needed, read.csv),
  paste0(files_needed, "_df")
)
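If you then really do want each data frame as a separate variable in the global environment (my addition, not part of the answer above), list2env() will do the assignment for you:

# copy each element of the named list into the global environment
list2env(my_imported_files, envir = .GlobalEnv)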
While developing a package I encountered the problem of supplementary data import - this has been 'kind of' solved here.
Nevertheless, I need to make use of a function from another package, which needs a path to the file it operates on. Sadly, using global-environment variables here is not an option.
[By the way: the file needs to be .txt, while supplementary data should be .RData. The function is quite picky.]
So I need to know how to get the path to a supplementary data file of a package. Is this even possible?
I had the idea of reading the .RData into the global environment and then saving it to a tmpfile for further processing, but I would really like to know a clean way - the supplementary data is ~100MB...
Thank you very much!
Use system.file() to reliably find the path to the installed package and its sub-directories. Typically such files are placed at your-pkg-source/inst/extdata/your-file.txt and then referenced as
system.file(package="your-pkg", "extdata", "your-file.txt")
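Inside your package code that might look like the following sketch, where other_pkg::process_file() is a hypothetical stand-in for the picky function that needs a plain path:

# locate the installed copy of the file shipped under inst/extdata/
path <- system.file("extdata", "your-file.txt", package = "your-pkg")
stopifnot(nzchar(path))  # system.file() returns "" if the file is not found
result <- other_pkg::process_file(path)  # hypothetical function needing a .txt path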
Situation
I wrote an R program which I split up into multiple R-files for the sake of keeping a good code structure.
There is a Main.R file which references all the other R-files with the 'source()' command, like this:
source(paste(getwd(), dirname1, 'otherfile1.R', sep="/"))
source(paste(getwd(), dirname3, 'otherfile2.R', sep="/"))
...
As you can see, the working directory needs to be set correctly in advance; otherwise, this could go wrong.
Now, if I want to share this R program with someone else, I have to pass along all the R files and folders in the same relative structure for things to work. Hence my next question.
Question
Is there a way to replace all the 'source' commands with the actual R script code which it refers to? That way, I have a SINGLE R script file, which I can simply pass along without having to worry about setting the working directory.
I'm not looking for a solution that is an 'R package' (which, by the way, is one single directory, so I would lose my own directory structure). I'm simply wondering if there is an easy way to combine these self-referencing R files into one single file.
Thanks,
OK, I think you could do something like scanning all the files and then writing them into one new file. This can be done using readLines and sink:
sink("mynewRfile.R")
for(i in Nfiles){
current_file = readLines(filedir[i])
cat("\n\n#### Current file:",filedir[i],"\n\n")
cat(current_file, sep ="\n")
}
sink()
Here I have assumed all your file paths are stored in a vector filedir of length Nfiles; I guess you can adapt that.
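For instance, filedir could be built automatically; this is my assumption about your layout, not part of the original answer:

# hypothetical setup: collect every .R file under the working directory,
# excluding the combined output file itself
filedir <- list.files(getwd(), pattern = "\\.R$", recursive = TRUE, full.names = TRUE)
filedir <- filedir[basename(filedir) != "mynewRfile.R"]
Nfiles  <- length(filedir)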