Test if in an R module - r

I'm exploring R modules (https://cran.r-project.org/web/packages/modules/vignettes/modulesInR.html) and I wondered if there is a way to check whether a function is being run inside a module (mind, not just defined in the module, but actually running inside it).
My use case is that I have a script that I sometimes source as a normal R script and sometimes load as a module with modules::use(). I would like to wrap the package loading in this script so that it uses library() when called normally and modules::import() when run inside the module.

Found a way by looking at how modules overrides the library function:
environmentName(environment()) == "modules:root"
This has to be evaluated in the environment where a function is defined, not inside the function itself.
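A minimal sketch of the resulting wrapper, assuming the check above and using dplyr purely as a placeholder package name:

if (environmentName(environment()) == "modules:root") {
  modules::import("dplyr")  # script was loaded via modules::use()
} else {
  library(dplyr)            # script was sourced as a plain R script
}

Note that this has to sit at the top level of the script, since the check inspects the environment in which it is evaluated.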

Related

R testthat: use external package only in test file - not in DESCRIPTION

I'm trying to run a testthat script using GitHub Actions.
I would like to test functionality of my function that allows it to be combined with (many) external packages. I want to exercise these external packages during R CMD check, but I don't want to load them generally (i.e. put them into the DESCRIPTION) - after all, most people will not use these external packages.
Any ideas how to include an external package only in the test files but not in the DESCRIPTION?
Thanks!
I think you describe a very standard use of Suggests.
I see two related but separable issues:
You want to test something using CI, in this case GHA. That is fine. Because you control the execution of the code, you could move your code from the test runner to, say, inst/examples and call it explicitly. That way the standard check of 'is the package using undeclared code' passes, as inst/examples is not checked.
You want to not force other people to load these packages. That is fine too, and we have Suggests: for this! Read Section 1.1 of Writing R Extensions for all the detailed semantics. If your package invokes other packages via tests, then every R CMD check touches this (and the external packages), so they must be declared. But you already know that only "some" people will want to use this "some of the time": that is precisely what Suggests: does, and you bracket the use with if (requireNamespace(pkgHere, quietly = TRUE)).
You can go either way, or even combine both. But you cannot call packages from tests and not declare them.
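A minimal sketch of that pattern, where somePkg, my_function() and helper() are placeholders rather than real APIs.

In DESCRIPTION:

Suggests: testthat, somePkg

In the test file, guarded as described above:

if (requireNamespace("somePkg", quietly = TRUE)) {
  testthat::test_that("works together with somePkg", {
    result <- my_function(somePkg::helper(1:10))
    testthat::expect_true(length(result) > 0)
  })
}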

How can I create a library in julia?

I need to know how to create a library in Julia and where I must keep it in order to call it later. I come from C and MATLAB, and it seems there is no documentation about practical programming in Julia.
Thanks
If you are new to Julia, you will find it helpful to realize that Julia has two mechanisms for loading code. Stating that you "need to know how to create a library in Julia" implies you most likely want to create a Julia module (docs) and possibly a package (docs). But the first method listed below may also be useful to you.
The two methods to load code in Julia are:
1. Code inclusion via include("file_path_relative_to_call_or_pwd.jl") (docs)
The expression include("source.jl") causes the contents of the file source.jl to be evaluated in the global scope of the module where the include call occurs.
Regarding where the "source.jl" file is searched for:
The included path, source.jl, is interpreted relative to the file where the include call occurs. This makes it simple to relocate a subtree of source files. In the REPL, included paths are interpreted relative to the current working directory, pwd().
Including a file is an easy way to pull code from one file into another one. However, the variables, functions, etc. defined in the included file become part of the current namespace. On the other hand, a module provides its own distinct namespace.
2. Package loading via import X or using X (docs)
The import mechanism allows you to load a package—i.e. an independent, reusable collection of Julia code, wrapped in a module—and makes the resulting module available by the name X inside of the importing module.
Regarding the difference between these two methods of code loading:
Code inclusion is quite straightforward: it simply parses and evaluates a source file in the context of the caller. Package loading is built on top of code inclusion and is quite a bit more complex.
Regarding where Julia searches for module files, see the docs summary:
The global variable LOAD_PATH contains the directories Julia searches for modules when calling require. It can be extended using push!:
push!(LOAD_PATH, "/Path/To/My/Module/")
Putting this statement in the file ~/.julia/config/startup.jl will extend LOAD_PATH on every Julia startup. Alternatively, the module load path can be extended by defining the environment variable JULIA_LOAD_PATH.
For one of the simplest examples of a Julia module, see Example.jl
module Example
export hello, domath
hello(who::String) = "Hello, $who"
domath(x::Number) = x + 5
end
and for the Example package, see here.
Side note: there is also a planned (future) library capability similar to what you may have used with other languages. See the docs:
Library (future work): a compiled binary dependency (not written in Julia) packaged to be used by a Julia project. These are currently typically built in-place by a deps/build.jl script in a project’s source tree, but in the future we plan to make libraries first-class entities directly installed and upgraded by the package manager.

How to call R script from another R script, both in same package?

I'm building a package that uses two main functions. One of the functions, model.R, requires a special type of simulation, sim.R, and a way to set up the results in a table, table.R.
In a sharable package, how do I call both the sim.R and table.R files from within model.R? I've tried source("sim.R") and source("R/sim.R") but that call doesn't work from within the package. Any ideas?
Should I just copy and paste the code from sim.R and table.R into the model.R script instead?
Edit:
I have all the scripts in the R directory, and the DESCRIPTION and NAMESPACE files are all set. I just have multiple scripts in the R directory: ~R/ has premodel.R, model.R, sim.R and table.R. I need the model.R script to use functions from both sim.R and table.R... located in the same directory in the package (e.g. ~R/).
To elaborate on joran's point, when you build a package you don't need to source functions.
For example, imagine I want to make a package named TEST. I will begin by generating a directory (i.e. folder) named TEST. Within TEST I will create another folder named R; in that folder I will include all the R script(s) containing the different functions in the package.
At a minimum you also need to include a DESCRIPTION and NAMESPACE file. A man directory (for help files) and a tests directory (for unit tests) are also nice to include.
Making a package is pretty easy. Here is a blog with a straightforward introduction: http://hilaryparker.com/2014/04/29/writing-an-r-package-from-scratch/
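The resulting layout would look roughly like this (a sketch, using the TEST name from this answer and the file names from the question):

TEST/
  DESCRIPTION
  NAMESPACE
  R/
    model.R
    premodel.R
    sim.R
    table.R
  man/
  tests/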
As others have pointed out, you don't have to source R files in a package. The package loading mechanism will take care of loading the namespace and making all exported functions available. So usually you don't have to worry about any of this.
There are exceptions however. If you have multiple files with R code situations can arise where the order in which these files are processed matters. Often it doesn't matter or the default order used by R happens to be fine. If you find that there are some dependencies within your package that aren't resolved properly you may be faced with a situation where a custom processing order for the R files is required. The DESCRIPTION file offers the optional Collate field for this purpose. Simply list all your R files in the order they should be processed to satisfy the dependencies.
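As an illustration, a Collate field for the files named in the question might look like this (a sketch; the right order depends on your actual dependencies, and once Collate is present every file under R/ must be listed):

Collate:
    'premodel.R'
    'sim.R'
    'table.R'
    'model.R'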
If all your files are in the R directory, any function will be in memory after you do a package build or devtools::load_all().
You may have issues if you have code in files that is not in a function, though.
R loads files in alphabetical order.
Usually, this is not a problem, because functions are evaluated when they are called for execution, not at loading time (i.e. a function can refer to another function not yet defined, even in the same file).
But if you have code outside a function in model.R, this code will be executed immediately when the file is loaded, and your package build will usually fail with
ERROR: lazy loading failed for package 'yourPackageName'
If this is the case, wrap the stray code of model.R in a function so you can call it later, once the package (and any external libraries) has fully loaded.
If this piece of code is there to initialize some value, consider using use_data() to have R take care of loading the data into the environment for you.
If this piece of code is just interactive code written while testing and developing the package itself, consider putting it elsewhere or wrapping it in a function anyway.
If you really need that code to be executed at loading time, or you really have a dependency to resolve, then you must add the Collate line to the DESCRIPTION file, as already stated by Peter Humburg, to force the order in which R loads the files.
roxygen2 can help you; put this before your code:
#' @include sim.R table.R
Call roxygenize(), and the Collate line will be generated for you in the DESCRIPTION file.
But even doing that, external libraries you may depend on are not yet loaded by the package, leading to failure again at build time.
In conclusion, you'd better not leave code outside functions in a .R file that lives inside a package.
Since you're building a package, the reason you're having trouble accessing the other functions in your /R directory is that you first need to run:
library(devtools)
document()
from within the working directory of your package. Now each function in your package should be accessible to any other function. Then, to finish up, do:
build()
install()
although it should be noted that a simple document() call will already be sufficient to solve your problem.
Make your functions global by defining them with <<- instead of <- and they will become available to any other script running in that environment.

Rcpp: Save compiled function as Robj

If I define a function in R, I can save the function object using the save function. Then I can load that function object using the load function and use it directly. However, if I have an Rcpp function and I try to save the compiled version and load it back into memory, I can no longer use that function object directly. Is this even possible? The reason I ask is that it takes a while to compile the function, and if there is a way to avoid that cost every time I launch an R environment, that would be great. Thanks!
No, in general you cannot serialize (and hence save) a function compiled with cxxfunction() or sourceCpp(). You need to freshly compile it, unless you place it in a package. Which is why packages are the way to go to really install your compiled code beyond quick experimentation.
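A minimal sketch of that recommendation, using Rcpp's own package skeleton helper (the package name myPkg is just a placeholder):

library(Rcpp)
# generate package scaffolding that includes a small example C++ function
Rcpp.package.skeleton("myPkg", example_code = TRUE)
# compile and install it once, e.g. from the shell: R CMD INSTALL myPkg
# from then on, library(myPkg) loads the already-compiled code,
# with no recompilation in new R sessions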

R - Execute a function in a file

There is an R file with a function getInfo() in it.
I want to run just this function from that script file.
Is that possible?
I know that sourcing the file and then calling the function by name will work.
But then it will also run the rest of the stuff in the script file, which I don't want.
What is the best way out here?
When you use source on a script file, all the code in that file will be loaded into the currently active R session. Any code that is not in a function will be executed. I see two options:
Put the function in a separate source file, or even a package if the number of functions grows.
Set a global R option using options() and retrieve its value in the file to be sourced using getOption(), making the execution of the non-function code dependent on this option. This does require you to always set this option before sourcing the file, in any project you use it in. (See the sketch after this list.)
I would go for option 1.
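For completeness, a minimal sketch of option 2; the option name run.script.body and the file name script.R are placeholders.

In script.R:

getInfo <- function() {
  "some info"  # the function you actually want to reuse
}
if (isTRUE(getOption("run.script.body", TRUE))) {
  message("running the rest of the script")  # non-function code goes here
}

In the calling session:

options(run.script.body = FALSE)
source("script.R")  # defines getInfo() without running the extra code
getInfo()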
