Temporarily loading and unloading packages in an R function

I am writing a function that will take the name of an installed package and return a data frame listing all the data frames available in that package along with the number and types of variables in those data frames.
In order to do this, I need to require the package temporarily so I can access its data sets. The problem I have is that requiring a package also introduces a whole lot of extra stuff into the search path and the loaded namespaces beyond just the package in question. I want my function to tidy up after itself, but I can't find a good way to detach everything that got imported when the package was required. In particular, detach seems to detach only the package, but not any of the other imported stuff.
Any advice?

I'm not sure which IDE you are working with, but many of them have tab completion. If I type ?unload at my console and hit <tab>, I immediately see ??unloadNamespace, so that would be a reasonable function to investigate. You should first look at:
?unloadNamespace
... and then decide whether that is sufficient. There is also the detach function, whose help page is linked from that one.
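For what it's worth, here is a rough sketch of the sort of cleanup this suggests. It is only a sketch under some assumptions (the package name is passed as a string and the package is already installed): record the search path and loaded namespaces before attaching, then detach and unload whatever is new when the function exits. Note that unloadNamespace() can still fail if some other loaded namespace imports the one you are trying to unload, hence the try().

list_package_data <- function(pkg) {
  search_before <- search()
  ns_before <- loadedNamespaces()
  library(pkg, character.only = TRUE)
  on.exit({
    # detach everything that was newly attached ...
    for (p in setdiff(search(), search_before))
      detach(p, character.only = TRUE)
    # ... then unload any newly loaded namespaces
    for (ns in setdiff(loadedNamespaces(), ns_before))
      try(unloadNamespace(ns), silent = TRUE)
  })
  datasets <- data(package = pkg)$results[, "Item"]
  # ... build and return the summary data frame of variables per data set here ...
  datasets
}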

Related

can't find a function from loaded package

I created a local package of personal functions so they can be used easily within R. One of them is meant to be used with the lidR package inside a wrapper function (i.e. grid_metrics). For this reason I took the scheme of this script as a reference, exporting both the long name (e.g. my_metrics(param1, param2, ...)) and the lazy one (e.g. .my_metrics), because I really like its ease of use.
Nevertheless, if I load my package and then call the lazy function
library(mypackage)
test = grid_metrics(las, .my_metrics, 20)
it does not work, so I have to load the function into memory by running its code from the file. After doing that, I can use it in both forms.
Within the NAMESPACE file I can see that both forms are exported so my last guess is that this might be related somehow to lazyeval but I don't get how.
It turned out that the problem was related to the section of the DESCRIPTION file in which the lidR package was listed: the issue was solved once I moved it from Imports to Depends. A minimal illustration of the change is shown below.
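For reference, this is roughly what the change amounts to in the DESCRIPTION file (other fields omitted). Before, with the lazy form not found:
Imports:
    lidR
After, so that lidR is attached whenever my package is attached:
Depends:
    lidR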

Automatic loading of data from sysdata.rda in package

I have spent a lot of time searching for an answer to what is probably a very basic question, but I just can't find the solution to my issue. The closest that I found was this exchange from a few years ago.
In that case, the issue was the location of the sysdata.rda file in the correct directory within the package. That is not my issue.
I have some variables that store things like color palettes that I am using inside a package. These variables are only used inside my functions, so I am storing them in R/sysdata.rda. However, when I load the package, the variables are not loaded into the package environment. If I load the data manually from sysdata.rda then everything works fine.
My impression from reading everything that I could find on internal data in R packages was that the data in R/sysdata.rda would load automatically.
Here is the code that I am using to store my data.
devtools::use_data(tmpBrks, tmpColors, prcpBrks, prcpChgBrks,
                   prcpChgBrkLabels, prcpColors, prcpChgColors,
                   internal = TRUE, overwrite = TRUE)
That successfully creates the data file at R/sysdata.rda and the data is in the file when I load it manually.
What do I need to do to have the data load automatically so the functions in my package can use them?
As usual, this was a bad combination of user ignorance and poor R documentation. The data was being loaded and was available to the functions. Where I went wrong was in assuming that the data would be visible in the package environment. That is not the case.
As far as I can tell, internal data in the R/sysdata.rda file is available to the functions within the package, but not visible in any other way. After I created the internal data file I was looking for the data in the package environment. When I didn't see it there I assumed that it wasn't loaded. As I kept pushing forward with my package development I finally realized that the data was being loaded silently and was accessible to the functions in the package.
As evidenced by the two upvotes my question got, I am not the only one who didn't understand the behavior of R/sysdata.rda internal data. Hopefully this explanation will save someone else a bunch of time searching for an answer to an issue that doesn't really exist.
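For anyone else landing here, a minimal illustration of that behavior (the package name mypkg is hypothetical; the palette object is one from my question). Inside one of the package's R/*.R files the object can be used directly:

plot_prcp <- function(x) {
  image(x, col = prcpColors)   # prcpColors comes from R/sysdata.rda
}

From an interactive session, after library(mypkg):

exists("prcpColors")    # FALSE: the object is not exported or attached anywhere
mypkg:::prcpColors      # ... but it is sitting in the package namespace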

Load data object when package is loaded

Is there a way to automatically load a data object from a package into memory when the package is loaded (but not yet attached)? I.e. the opposite of lazy loading? The object is used in one of the package functions, so it needs to be available at all times.
When the package is set to LazyData: false, the data object is not made available by the package at all and needs to be loaded manually with data(). We could use something like:
.onLoad <- function(lib, pkg){
  data(mydata, package = pkg)
}
However, data() loads the object into the global environment. I would strongly prefer to load it into the package environment (which is what lazydata does), to prevent masking conflicts.
A workaround is to bypass the data mechanics completely and simply hardcode the object in the package. The package file R/myscore.R would then look like
mymodel <- readRDS("inst/mymodel.rds")

myscore <- function(newdata){
  predict(mymodel, newdata)
}
But this will lead to a huge package database for large data objects, and I am not sure what the consequences of that are.
As you say
The object is used in one of the package functions, so it needs to be available at all times.
I think the author of that package should really NOT use data(.) for that.
Instead, he should define the object inside the package's R/ directory, either with plain R code in an R/*.R file, or by using the sysdata.rda approach that is explained in the famous first reference for all these questions,
"Writing R Extensions". In both cases the package author can also export the object, which is often desirable for other users, as in your case.
Of course this needs a polite conversation between you and the package author, and will only apply to the next version of that package.
I'm going to post this since it seems to work for my use case.
.onLoad() is:
.onLoad <- function(lib, pkg) {
  data(mydata, package = pkg,
       envir = parent.env(environment()))
}
Also need Imports: utils in DESCRIPTION and importFrom(utils, data) in NAMESPACE in order to pass R CMD check.
In my case I don't need the data object to be visible to the user, I need it to be visible to one of the functions in the package. If you need it visible to the user, that's going to be even harder (I think) because as far as I can tell you can't export data, just functions. The only way I've thought of to export data is to export a wrapper function for the data.
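A minimal sketch of that wrapper idea (names hypothetical): keep the object internal and expose it through an exported accessor function.

get_mydata <- function() {
  # 'mydata' is found in the package namespace (loaded via .onLoad() above,
  # or stored in R/sysdata.rda)
  mydata
}

with export(get_mydata) in NAMESPACE.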

R package namespace issue using data() -- data set not found

I've hit an issue trying to import a package (namely, 'robfilter') inside one of my own packages. One of its methods that I am trying to use, adore.filter, is failing at this line:
data(critvals)
With error 'data set 'critvals' not found'.
The function works fine if I load the library via require(robfilter). However, this means that in order to use my custom package which calls adore.filter, I will have to load my own package, and then load robfilter. Not a huge problem but slightly annoying.
I'm not sure if the problem is that there is an extra step I need to do in order to make critvals visible within my package, or if perhaps there is something the package author needed to do (and hasn't done) to add critvals to its package namespace; there is no sign of 'critvals' in the robfilter NAMESPACE file. I haven't encountered this issue before and don't really understand how the use of data() inside a package is supposed to work.
There are two solutions as far as I know:
Either ask the robfilter maintainer to put the data needed by robfilter into robfilter's internal data file (R/sysdata.rda),
or make your package Depend on robfilter.
So it works if you put robfilter in the Depends section of your DESCRIPTION file. But in my case (both are my packages) I was trying to avoid the Depends solution, because it attaches the imported package, and any other package will in turn need to depend on its imported package... My question is quite a duplicate of yours, just in a different context. A short illustration of why Imports alone is not enough is given below.
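A hypothetical interactive session showing the difference (data() with no package argument only searches attached packages):

requireNamespace("robfilter")         # roughly what Imports gives you: namespace loaded
"package:robfilter" %in% search()     # FALSE -- loaded but not attached
data(critvals)                        # fails: data set 'critvals' not found

library(robfilter)                    # what Depends gives you: the package is attached
data(critvals)                        # now found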

Can I load an RData file while bypassing loading the namespaces?

Let's say some of my users cannot alter their R environments, but I need them to be able to open up RData files. These environment files require a package to be loaded (httpuv to be exact). We don't care about the package, we don't need its capabilities, we just need to get at the data. Is there a way to either force R to bypass loading namespaces when loading the RData file, or force it to save it without namespace dependencies at the originating end? Thanks.
To reproduce, install Shiny. Create and save some R objects to the server's file system from within a Shiny applet as an RData file. Copy the file over to a computer that doesn't have Shiny or the httpuv package installed. Try loading the RData file, even if the actual objects you saved are completely ordinary data.frames that have nothing to do with Shiny or httpuv.
I ran strings on the RData file, and the damn thing is full of references to httpuv. The software is loading the file and then actively deciding not to continue, in the internal loadFromConn2() function. Therefore there must be a way to make it stop doing so.
Really @baptiste should get credit for the link in his comment to some general solutions, especially the R CMD INSTALL --fake trick, and I will accept that if he reposts it as an answer. That is why I am not accepting the following answer of my own to the specific problem that caused it in my case, but I am posting my answer in case it helps someone else.
Some of the objects I was saving were lm fitted objects. Those contain formula/terms objects (at least two each, for some reason... maybe because they've been through stepAIC), and those formulas in turn each have an environment attribute. The environment attribute is .GlobalEnv which probably does contain copies of package functions someplace. When I dug through the objects inside the fitted models, and then the objects inside all the attributes of those objects, and then the objects inside the attributes of the attributes of those objects... and set every environment attribute I could find to NULL, eventually I was able to save that fitted model to a file that could be opened from a different R installation without getting the error about not being able to load a namespace.
I suppose I could also write a function to iterate through the objects within a fitted model, and their attributes, and remove environments but that sounds ugly and dangerous. Maybe there is a way to force formulas and fitted models not to retain environments, and that will be better. For the time being, instead of saving fitted models, I will save their call attributes after scrubbing any environment attributes I might find there. If that doesn't work, I'll deparse them into character strings.
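For what it's worth, a rough sketch of what such a helper might look like (hypothetical, not the code I actually ran, and it does not chase environments hidden inside attributes of attributes, which may also be needed):

strip_env_attrs <- function(x) {
  # drop any formula/terms environment carried on this object
  if (!is.null(attr(x, ".Environment", exact = TRUE)))
    attr(x, ".Environment") <- NULL
  # recurse into list-like objects (lm fits are lists of components)
  if (is.list(x))
    x[] <- lapply(x, strip_env_attrs)
  x
}

fit_clean <- strip_env_attrs(fit)   # 'fit' is the fitted lm/stepAIC object
saveRDS(fit_clean, "mymodel.rds")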
PS: I used the RDS format and haven't yet tested it with RData, but I suspect that the problem was the saving of the evaluation environment in some of the attributes, and had nothing to do with the format in which the objects get saved. I'll post an update if it turns out that this doesn't also work with RData.
PPS: I suspect I'm not the only one here who's hearing about the R CMD INSTALL --fake trick for the first time, and perhaps the word should be spread about this... because to the extent other R users don't know about it, this remains an obvious vector for denial-of-service attacks against R!
I will accept my own answer to get rid of the SO auto-nagger, but will unaccept it and accept @baptiste's if they make it possible for me to do so by posting it as an answer. Thanks.
