R: calling the Matrix package within a function

I am using the package Matrix to create a sparse matrix inside a function that is contained in my personal R package, say
myfun <- function(x, ...) {
  d <- Matrix(0, 5, 5, sparse = TRUE)
  d[seq(1, 25, by = x)] <- 1
  t(d)
}
(class(d) is "dgCMatrix")
If I declare the function in the general environment and I run it from the console, everything works. However, when it is called from the package, I get
Error in t.default(D) : argument is not a matrix
Essentially, R tries to call the t() function from the base environment instead of the version of t() provided by the Matrix package, which has the methods to handle dgCMatrix objects.
I tried explicitly calling library(Matrix) from within the function, but it does not help.
I had the same problem with another function, colSums(), and I solved it by using Matrix::colSums() instead. However, I would like a more general and practical remedy than specifying the namespace for every function.
I stress that if I define the function in the global environment, R correctly uses the functions from Matrix.
Any idea what the source of the problem is, or how I can force R to pick these functions up from Matrix?
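For reference, the usual general remedy (not stated in the thread above, and assuming a standard DESCRIPTION/NAMESPACE package layout) is to import Matrix into your package's namespace instead of calling library() at run time, so that its S4 methods for t(), colSums(), and friends are visible during dispatch:

## In DESCRIPTION:
##   Imports: Matrix

## In NAMESPACE, either import the whole package...
import(Matrix)

## ...or only what is actually used (the selective form is illustrative;
## the exact directives depend on what Matrix exports):
importFrom(Matrix, Matrix)
importMethodsFrom(Matrix, t, colSums)

## With roxygen2, the same effect via a tag in the function's documentation block:
## #' @import Matrix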

Related

Using variable from another script (inter-file function closure)

I have a script main.R, where I create the inv_cov_mat variable. I later load metrics.R and use it to calculate a function value (I use it as a kind of inter-script function closure). I get the error "object 'inv_cov_mat' not found". My code:
main.R:
knn <- function(...)
{
  # some code
  source("./source/metrics.R")
  if (metric == "mahalanobis")
    inv_cov_mat <- solve(cov(training_set))
  # other code
  # calculate the distance in the given metric between the current vector
  # and every row vector of the training set matrix
  distances <- apply(training_set, 1, metric, vec2 = curr_vec) # error
  # ...
}
metrics.R:
mahalanobis <- function(vec1, vec2)
{
  diff <- vec1 - vec2
  sqrt(t(diff) %*% inv_cov_mat %*% diff)
}
I've found a simple, if not elegant, answer: use inv_cov_mat as a global variable rather than creating it inside knn. Then other scripts can see it.
It's not entirely clear what you want, but if I understand you correctly, you have a character string identifying the metric you want to use and a function with the same name, so you should be able to use get to retrieve the function based on the name.
metric <- "mahalanobis"
metric.fun <- get(metric)
distances <- apply(training_set, 1, metric.fun, vec2 = curr_vec)
That said, there are probably better ways to organize your code that would avoid this problem entirely, e.g. create a named list of functions for accessing metrics.
EDIT: regarding the issue of inv_cov_mat, either pass it as an argument to your metric function, or use get inside that function (with its envir argument) to access variables from the parent environment. Passing the variable as an argument to your metric function is definitely the better and cleaner approach.
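A minimal sketch of the preferred variant (variable names follow the question; this is not code from the original answer): give the metric function an explicit parameter instead of relying on a global.

mahalanobis <- function(vec1, vec2, inv_cov_mat)
{
  diff <- vec1 - vec2
  sqrt(t(diff) %*% inv_cov_mat %*% diff)
}

# inside knn(), after computing inv_cov_mat:
distances <- apply(training_set, 1, metric.fun, vec2 = curr_vec,
                   inv_cov_mat = inv_cov_mat)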

Which R functions are based exactly on BLAS (ATLAS, LAPACK and so on)?

It is known that base R uses BLAS for calculation speedup. In my code I want to use those functions from base R (and perhaps its packages) that do use BLAS. How can I get a list of the R functions that actually use BLAS? Or how can I check whether a function I want to use in my code uses BLAS (ATLAS, LAPACK and so on)?
This is not a complete answer, as I am no expert in this, but maybe you or someone else can take some of these starting ideas and build a solution from them (it would be great if you could post that then!).
Inspect whether R's functions are defined in terms of C code
Only functions that call C code are candidates for using BLAS, so identifying those functions could be a first step.
capture.output(print(FUN)) gives you the definition of a function as a character vector (one element per line). So, to list all functions that are defined in terms of .Internal, .Primitive, etc., do the following:
# Set this to the package you want to screen
envName <- 'base'
# Get the environment for the given name
env <- pos.to.env(which(search() == paste0('package:', envName)))
# Return TRUE if `string` contains `what`
contains <- function(string, what) {
  length(grep(what, string, fixed = TRUE)) != 0
}
# Build up a logical matrix that is TRUE where an element is defined in terms
# of the following functions, which indicate calls to compiled code
signalWords <- c('.Primitive', '.Internal', '.External',
                 '.Call', '.C', '.Fortran')
envElements <- ls(envir = env)
funTraits <- matrix(FALSE, nrow = length(envElements), ncol = length(signalWords),
                    dimnames = list(envElements, signalWords))
# Fill in the matrix by reading each element's printed definition
for (elementName in envElements) {
  element <- get(elementName, envir = env)
  if (!is.function(element)) {
    next
  }
  fun.definition <- capture.output(print(element))
  for (s in signalWords) {
    if (contains(fun.definition, s)) {
      funTraits[elementName, s] <- TRUE
    }
  }
}
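Once funTraits is filled in, it can be queried directly; for example (a small usage sketch):

# Functions in the screened package that appear to call Fortran directly
rownames(funTraits)[funTraits[, '.Fortran']]
# Functions using any of the compiled-code interfaces at all
rownames(funTraits)[rowSums(funTraits) > 0]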
When a function calls an external C function (as opposed to a .Primitive function), it looks like this:
dnorm
## function (x, mean = 0, sd = 1, log = FALSE)
## .External(C_dnorm, x, mean, sd, log)
## <bytecode: 0x1b1eebfc>
## <environment: namespace:stats>
Then hunt down the object that is passed to .External; it carries the name of the corresponding C function. Use PACKAGE:::OBJECT$name to find it:
stats:::C_dnorm$name
## [1] "dnorm"
See further: How can I view the source code for a function? (which also has information about where to find the source code of compiled functions) and How to see the source code of R .Internal or .Primitive functions?
Finally, you'll have to somehow screen the C code and all the functions it calls for BLAS routines...
LD_PRELOAD a library which logs when a BLAS function is called
You could develop a shared library that exports BLAS's symbol names but just logs each call (and where it came from) before forwarding it to the real BLAS routine. The LD_PRELOAD UNIX environment variable can be used to load this library ahead of BLAS. This only works if R was compiled to load BLAS as a dynamically linked (shared) library.
https://blog.netspi.com/function-hooking-part-i-hooking-shared-library-function-calls-in-linux/
See also: Why can R be linked to a shared BLAS later even if it was built with `--with-blas = lblas`?
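Independently of which functions call it, you can check which BLAS/LAPACK build your R session is linked against; a quick sketch (output details vary by R version):

sessionInfo()     # recent R versions print the BLAS and LAPACK library paths
La_version()      # the LAPACK version in use
extSoftVersion()  # versions of linked external software; newer R also lists BLAS here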

Handling matrices using Brobdingnag package

I need to build a matrix with extremely small entries.
So far, the fastest way I have found to define the kind of matrix I need is:
Define a vectorized function of the coordinates:
func <- function(m, n) { ... }
Combine every possible coordinate using outer:
matrix <- outer(1:100, 1:100, FUN = func)
Since I have to deal with extremely small numbers, func works internally with brob numbers, so its output is itself of brob type:
typeof(func(0:100, 0:100))
[1] "S4"
If I plug the two vectors 0:100 directly into func it returns a vector of brobs, but if I try to use it with outer I get the error:
Error in outer(1:100, 1:100, FUN = func) : invalid first argument
I suppose this is because the Brobdingnag package can somehow deal with vectors but not with matrices. Is that right? Is there any way to make it work?
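I haven't verified it against the original code, but the failure is most likely because outer() needs a result it can reshape into a matrix, which an S4 brob vector is not. One workaround, assuming func can be written on the log scale (log_func and its formula below are purely illustrative), is to let outer() work on plain doubles and only wrap the result in a brob at the end:

library(Brobdingnag)

log_func <- function(m, n) -(m * n)                 # log of the intended entry exp(-m*n)
log_entries <- outer(1:100, 1:100, FUN = log_func)  # ordinary numeric matrix

# brob(x) represents exp(x), so very negative logs do not underflow;
# the result is a brob vector, with the matrix shape kept separately
entries <- brob(as.vector(log_entries))
dims <- dim(log_entries)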

How to avoid argument name collision in R snow/ snowfall package, sfSapply?

I'm trying to use the snow and snowfall packages, specifically the sfSapply() function, to extract data from multiple raster files. It looks something like this:
queue <- list(rast1, rast2, rast3)
sfInit(parallel=TRUE, cpus=3)
sfLibrary(raster)
sfLibrary(rgdal)
sfLibrary(sp)
a <- sfSapply(queue, extract, sp=TRUE, fun=mean, y=tracts)
sfStop()
The fun=mean argument is intended for the extract() function (in the raster library). However, sfSapply() also has a fun argument of its own, namely the function to apply (here extract()), which I've provided as the second positional argument.
How can I specify a fun argument for the passed function and not have it confused with the fun argument expected by sfSapply()?
One workaround is to create a custom extract function that has these arguments built in:
sfRasterExtract <- function(raster_obj) {
  extract(raster_obj, sp = TRUE, fun = mean, y = tracts)
}
Make sure to call sfExportAll() after sfInit() so that the function (and the tracts object) is exported to all worker instances, then run
a <- sfSapply(queue, sfRasterExtract)
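An equivalent inline variant (just a sketch of the same idea): wrap extract() in an anonymous function and export only the objects the workers need.

sfExport("tracts")   # ship tracts to the workers instead of exporting everything
a <- sfSapply(queue, function(raster_obj) {
  extract(raster_obj, y = tracts, sp = TRUE, fun = mean)
})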

S4 object in R cannot be passed to Fortran

I use the bdiag function from the Matrix package in R to generate a block-diagonal matrix, and then I pass the resulting matrix (called mat) to a self-written function, but R fails to execute it with the following error:
Error: invalid mode (S4) to pass to Fortran (arg 1)
I checked isS4(mat) and it's TRUE. Thus, I guess there is a way to convert the S4 object so that it can be passed to the function. Any advice will be greatly appreciated!
UPDATE: I use the following code to construct the block-diagonal matrix:
grp.ids <- as.factor(c(rep(1, 8), rep(2, 4), rep(3, 2)))
x <- model.matrix(~ grp.ids)
X <- do.call(bdiag, replicate(238, x, simplify = FALSE))
Is there any other way to get an ordinary (S3) matrix without using the bdiag function? Thanks!
Only the .Call() interface can pass full R objects down to C or C++ code; see Section 5 of the Writing R Extensions manual. With .Fortran() and .C() you are limited to basic vectors of int, double, ..., and their corresponding Fortran types.
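A practical consequence for the question above (a sketch; the routine name and argument list are placeholders, not a real call): densify the S4 sparse matrix into a base matrix before handing it to .Fortran(), accepting the memory cost.

library(Matrix)

# X is the block-diagonal "dgCMatrix" built with bdiag() above;
# .Fortran()/.C() accept only atomic data, so convert it first
X_dense <- as.matrix(X)             # plain base matrix
storage.mode(X_dense) <- "double"   # match the Fortran dummy argument's type

# out <- .Fortran("myroutine", X_dense, nrow(X_dense), ncol(X_dense))  # hypothetical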

Resources