I have a folder that contains a lot of R files; each file actually defines a function. What I need is code in another R project that will load each file in that folder and bring these functions into the environment.
I know the better option would be to create an R package from these functions, but that can't be done in my case for several reasons.
What is the simplest way to achieve my goal?
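A minimal sketch of such a loader, assuming the functions all live in a single folder ("path/to/functions" is a placeholder):
# List every .R file in the folder and source each one into the environment.
files <- list.files("path/to/functions", pattern = "\\.[rR]$", full.names = TRUE)
invisible(sapply(files, source))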
I have recently created a Shiny application project with RStudio, with sub-directories for inputs, outputs, packages and code. At the beginning of the code, I call .libPaths("myProject/packages") and then proceed with the code, which installs and loads packages from that folder. This seems very intuitive to me, because I need not disturb the native package installation, and I have all the required packages in a single folder. My question is whether this folder is portable. That is, if I take this folder as it is, put it in a different directory or on a different system with its own native R installation, and just point to this folder as the library path, will it work without issues? I have already read the solution using R-Portable, but I would rather not use that if I can help it, because it would mean using up more space than required if I work with multiple projects.
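For reference, a portable way to write that library-path call (a sketch; "packages" is the sub-directory from the question, and file.path() builds the path with forward slashes on every OS):
# Prepend the project-local library, keeping the default library paths as a fallback.
.libPaths(c(file.path(getwd(), "packages"), .libPaths()))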
I have created a script with a few custom functions and have saved the file in one of my folders.
However, if I want to access the functions using source("Functions.R"), I have to have the "Functions.R" file in my working directory. Is there any way of fetching the file without copying it to my current working directory? (I do not want to create a package for it.)
Thanks!
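A minimal sketch of the usual workaround: pass source() the full path to the script instead of copying it (the path below is a placeholder):
# Source the file from wherever it lives; no copy into the working directory needed.
source("C:/path/to/scripts/Functions.R")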
I've been using http://r-pkgs.had.co.nz as a guide with much success, but I'm stuck on something that I haven't been able to resolve for many hours. Perhaps because I'm a bit thick...
Basically, I want to include a bunch of CSVs as data.frames in an R package I'm working on. If I manually save the data objects as .rda or .RData files and place them in the <package_dir>/data folder, I can build the package and the data is accessible upon loading it. However, these CSVs will receive updates every so often, and when that happens I'd like to just hit 'Build' in RStudio and have the data objects rebuilt from the CSVs using some very simple processing scripts I have written.
So, based on Hadley's guide, I've run devtools::use_data_raw() and put the CSVs in the <package_dir>/data-raw folder. In that folder I've also placed R scripts that turn these data files into R objects and then save them to the correct location and in the correct format with devtools::use_data(<object_name>).
My original interpretation was that when building the package, the scripts in <package_dir>/data-raw would be run to produce the .rda files in the <package_dir>/data folder. I'm guessing this is incorrect? If so, is there a way to automatically source those scripts when building the package? And would this be bad practice?
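For illustration, a data-raw script of the kind described might look like the sketch below (file and object names are placeholders). Note that scripts in data-raw are not run automatically at build time; you re-run them by hand whenever the CSVs change:
# data-raw/my_dataset.R -- re-run manually after the CSV is updated.
my_dataset <- read.csv("data-raw/my_dataset.csv", stringsAsFactors = FALSE)
# ... any simple processing steps here ...
devtools::use_data(my_dataset, overwrite = TRUE)  # writes data/my_dataset.rda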
I am currently working on a package that I want to bundle some large .rda files with (hundreds of MB). If I use devtools::load_all(), my package takes forever to load, since I included the files in the /data/ directory.
Is there a way to tell R to ignore the files in /data/ until I manually load them with data(), or am I better off just putting my data into a different directory?
How about you create a directory inst/optionalData/ (or another suitable name) and add functions to load these data sets on demand? You can rely on
system.file("optionalData", "nameOfFile.rds", package="yourPackage")
to find them.
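A hedged sketch of such an on-demand loader (the function name is a placeholder; the file and package names are from the lines above):
# Load an optional data set shipped in inst/optionalData/ only when requested.
loadOptionalData <- function(name = "nameOfFile.rds") {
  path <- system.file("optionalData", name, package = "yourPackage")
  if (path == "") stop("optional data file not found: ", name)
  readRDS(path)
}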
I'm new to R and frankly the amount of documentation is overwhelming, and I haven't been able to find the answer to this question.
I have created a number of .R script files, all stored in a folder that I can access on my server (let's say the folder is, written with Windows backslashes, \\servername\Paige\myscripts).
I know that in R you can call each script individually, for example (using the forward slash required in R)
source(file="//servername/Paige/myscripts/con_mdb.r")
and now this script, con_mdb, is available for use.
If I want to make all the scripts in this folder available at startup, how do I do this?
Briefly:
Use your ~/.Rprofile in the directory found via Sys.getenv("HOME") (or if that fails, in R's own Rprofile.site)
Loop over the contents of the directory via dir() or list.files().
Source each file.
e.g., via this one-liner:
sapply(list.files("//servername/Paige/myscripts/", pattern = "\\.[rR]$", full.names = TRUE), source)
But the real story is that you should not do this: create a package instead, and load that. There are a bazillion other questions here on how to build a package. Research it -- it is worth it.
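For completeness, a sketch of what the ~/.Rprofile entry could look like, with the caveat above that a package is the better route:
# Source every script in the shared folder at R startup.
# local() keeps the helper variables out of the global environment;
# source() still loads the functions themselves into the global environment.
local({
  script_dir <- "//servername/Paige/myscripts/"
  for (f in list.files(script_dir, pattern = "\\.[rR]$", full.names = TRUE)) source(f)
})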
By far the best way is to create a package! But as a first step, you could also create a single R script file (collection.r) in your script directory which sources all the other scripts using relative paths.
In your separate project scripts, you then only need to include that one script with
source(file="//servername/Paige/myscripts/collection.r", chdir = TRUE)
which changes the directory before sourcing. That way, each project only has to include a single file.
In the collection file you could loop over all the files (except collection.r itself) or simply list them all.
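A sketch of such a collection.r, assuming it is sourced with chdir = TRUE as above, so the working directory is the script folder itself:
# collection.r: source every sibling .R script except this file.
for (f in setdiff(list.files(".", pattern = "\\.[rR]$"), "collection.r")) {
  source(f)
}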