I have created three procedures inside a package. Now I want to compile only two of those procedures inside the package.
Is it possible? If yes, how?
It's not possible. You can compile the package body separately from its specification (header), but not the individual procedures in the package body; you can only compile the thing as a whole.
You can recompile it using the ALTER PACKAGE statement. In its simplest form:
ALTER PACKAGE YourPackage COMPILE PACKAGE; -- Whole package
ALTER PACKAGE YourPackage COMPILE SPECIFICATION; -- Spec/header only
ALTER PACKAGE YourPackage COMPILE BODY; -- Body only
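You can then check the result in the data dictionary; this standard query (YOURPACKAGE is a placeholder) shows whether the spec and body are VALID or INVALID:
SELECT object_name, object_type, status
FROM   user_objects
WHERE  object_name = 'YOURPACKAGE';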
I have already made a simple R package (pure R) to solve a problem with brute force, and then I tried to speed up the code by writing an Rcpp script. I wrote a script to compare the running times with the "bench" library. Now, how can I add this script to my package? I tried to add
#' @importFrom Rcpp cppFunction
on top of my R script and put the Rcpp file in the src folder, but it didn't work. Is there a way to add it to my R package without creating the package from scratch? Sorry if this has already been asked, but I am new to all this and completely lost.
That conversion is actually (still) surprisingly difficult, in the sense of requiring more than just one file change. It is easy to overlook details. Let me walk you through why.
Let us assume for a second that you started a working package using package.skeleton() from base R. That is the simplest and most general case. The package will work (yet have warnings; see my pkgKitten package for a wrapper that cleans up, and there are a dozen other package-helper functions and packages on CRAN). Note in particular that I have said nothing about roxygen2, which at this point is just an added complication, so let's focus on just .Rd files.
You can now contrast your simplest package with one built by and for Rcpp, namely by using Rcpp.package.skeleton(). You will see at least these differences (sketched below):
DESCRIPTION for LinkingTo: and Imports
NAMESPACE for importFrom as well as the useDynLib line
a new src directory and a possible need for src/Makevars
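For orientation, these are the kinds of lines Rcpp.package.skeleton() adds (mypackage is a placeholder, and the exact output can differ between Rcpp versions):
In DESCRIPTION:
    Imports: Rcpp
    LinkingTo: Rcpp
In NAMESPACE:
    importFrom(Rcpp, evalCpp)
    useDynLib(mypackage, .registration = TRUE)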
All of which make it easier to (basically) start a new package via Rcpp.package.skeleton() and copy your existing package code into that package. We simply do not have a conversion helper. I still do the "manual conversion" you tried every now and then, and even I need a try or two and I have seen all the error messages a few times over...
So even if you don't want to "copy everything over" I think the simplest way is to
create two packages with and without Rcpp
do a recursive diff
apply the differences to your original package.
PS: And remember that when you use roxygen2 and have documentation in the src/ directory, always run Rcpp::compileAttributes() first, before running roxygen2::roxygenize(). RStudio and other helpers do that for you, but it is still easy to forget...
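So a manual rebuild, run from the package root, would look like:
Rcpp::compileAttributes()   # regenerates R/RcppExports.R and src/RcppExports.cpp
roxygen2::roxygenize()      # then rebuilds NAMESPACE and the .Rd files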
I am writing an R package that should be able to compile C++ code on the fly. In practice, users can define, at run time, operators based on C++ code that is compiled and then used in computations (for efficiency purposes, as with PyTorch or TensorFlow models in Python). Ideally, the code compiled at run time should use Rcpp features so that it is exported to R.
Example:
In my R package, I have a function def_operator that can parse some mathematical formula defining an operator.
my_custom_op <- def_operator("x+y", args = c("x", "y"))
My C++ API knows how to generate the C++ code associated with this formula. This code should be compiled on the fly (just once, not at each call).
The user can use this new function to do some computations.
res <- my_custom_op(1, 3) # should give 4
Note: this is an example; the operators defined by the user aim at doing more than adding scalar numbers, and the interest is clearly to let the user define their own operators, not to pre-define some generic operators compiled at installation.
I know two things for the moment:
the C++ code required to generate the operators (which is not compiled at installation) should be put in the package's inst directory; it will be copied at installation, and I can find where with the R function find.package().
I can use the function sourceCpp() to compile code on the fly. Thus I can define some functions in C++ that will be automatically exported to R and be callable there. It is even possible to keep the shared library to avoid multiple compilations (see Rcpp: how to keep files generated by sourceCpp?).
Here are my questions:
Do you know of any alternatives to sourceCpp from the Rcpp package to compile C++ code on the fly and export it to R?
Is there some way to manage compilation options for sourceCpp other than using the file ~/.R/Makevars? (I need to link against the code in the inst directory and I don't want to edit this file on the user's system.)
Finally, do you know of some R packages implementing compilation on the fly that I could take as examples?
Do you know of any alternatives to sourceCpp from the Rcpp package to compile C++ code on the fly and export it to R?
Using sourceCpp() is the best approach. Alternatively, you can use its predecessor cxxfunction() from the inline R package. Otherwise, you will need to build your own shared library via R CMD SHLIB, load it, and create a wrapper yourself. (Not fun.)
Is there some way to manage compilation options for sourceCpp other than using the file ~/.R/Makevars? (I need to link against the code in the inst directory and I don't want to edit this file on the user's system.)
Yes, many Makevars variables can be set per R session via Sys.setenv(), e.g. Sys.setenv("PKG_LIBS" = ...).
Now, to retrieve a file location dynamically, consider RcppMLPACK1's flag function approach.
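For instance, a minimal sketch, assuming a package mypackage that ships headers and a library under inst/ (both names are placeholders); these settings apply to subsequent sourceCpp() calls in the current session and leave ~/.R/Makevars untouched:
# set compiler and linker flags for this session only, pointing at
# files shipped inside the installed package
Sys.setenv(PKG_CXXFLAGS = paste0("-I", system.file("include", package = "mypackage")))
Sys.setenv(PKG_LIBS = paste0("-L", system.file("libs", package = "mypackage")))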
Finally, do you know of some R packages implementing compilation on the fly that I could take as examples?
There are a couple entrants in this market:
the armacmp package by Dirk Schumacher, which translates R code to C++ using the Armadillo library
the nCompiler package by Perry de Valpine et al., for code-generating C++ and easily interfacing between R and C++
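To make the sourceCpp(code = ...) route concrete, here is a minimal sketch of the def_operator() idea from the question; the code generation is deliberately naive and all names (def_operator, custom_op) are illustrative, not an existing API:
library(Rcpp)

def_operator <- function(expr, args) {
  # build the C++ source for a function returning the user's formula
  cpp_args <- paste(sprintf("double %s", args), collapse = ", ")
  code <- sprintf(
    "#include <Rcpp.h>\n// [[Rcpp::export]]\ndouble custom_op(%s) { return %s; }",
    cpp_args, expr)
  env <- new.env()
  sourceCpp(code = code, env = env)   # compiled once, at definition time
  env$custom_op
}

my_custom_op <- def_operator("x + y", args = c("x", "y"))
my_custom_op(1, 3)   # 4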
I'm building a package that uses two main functions. One of the functions, in model.R, requires a special type of simulation from sim.R and a way to set up the results in a table from table.R.
In a sharable package, how do I call both the sim.R and table.R files from within model.R? I've tried source("sim.R") and source("R/sim.R"), but those calls don't work from within the package. Any ideas?
Should I just copy and paste the code from sim.R and table.R into the model.R script instead?
Edit:
I have all the scripts in the R directory, and the DESCRIPTION and NAMESPACE files are all set. I just have multiple scripts in the R directory: R/ has premodel.R, model.R, sim.R, and table.R. I need the model.R script to use the functions from both sim.R and table.R, located in the same directory in the package (i.e. R/).
To elaborate on joran's point, when you build a package you don't need to source functions.
For example, imagine I want to make a package named TEST. I will begin by generating a directory (i.e. folder) named TEST. Within TEST I will create another folder named R; in that folder I will include all R scripts containing the different functions in the package.
At a minimum you also need to include a DESCRIPTION and a NAMESPACE file. A man directory (for help files) and a tests directory (for unit tests) are also nice to include.
Making a package is pretty easy. Here is a blog with a straightforward introduction: http://hilaryparker.com/2014/04/29/writing-an-r-package-from-scratch/
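For the TEST example above, a minimal layout would be (man/ and tests/ are optional extras; the file names match the question):
TEST/
  DESCRIPTION
  NAMESPACE
  R/
    model.R
    sim.R
    table.R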
As others have pointed out, you don't have to source R files in a package. The package loading mechanism will take care of loading the namespace and making all exported functions available. So usually you don't have to worry about any of this.
There are exceptions, however. If you have multiple files with R code, situations can arise where the order in which these files are processed matters. Often it doesn't matter, or the default order used by R happens to be fine. If you find that there are some dependencies within your package that aren't resolved properly, you may be faced with a situation where a custom processing order for the R files is required. The DESCRIPTION file offers the optional Collate field for this purpose. Simply list all your R files in the order they should be processed to satisfy the dependencies.
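For the files from the question, a Collate field forcing sim.R and table.R to be processed before model.R would look like this in DESCRIPTION:
Collate:
    'premodel.R'
    'sim.R'
    'table.R'
    'model.R'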
If all your files are in the R directory, any function will be in memory after you build the package or run devtools::load_all().
You may have issues if you have code in files that is not inside a function, though.
R loads files in alphabetical order.
Usually, this is not a problem, because functions are evaluated when they are called for execution, not at loading time (i.e. a function can refer to another function not yet defined, even in the same file).
But if you have code outside a function in model.R, this code will be executed immediately when the file is loaded, and your package build will usually fail with:
ERROR: lazy loading failed for package 'yourPackageName'
If this is the case, wrap the loose code of model.R into a function so you can call it later, once the package (and any external libraries) have fully loaded.
If this piece of code is there to initialize some value, consider using use_data() to have R take care of loading the data into the environment for you.
If this piece of code is just interactive code written to test and develop the package itself, you should consider putting it elsewhere or wrapping it in a function anyway.
If you really need that code to be executed at loading time, or really have a dependency to resolve, then you must add the Collate line to the DESCRIPTION file, as already stated by Peter Humburg, to force the order in which R loads the files.
roxygen2 can help you here; put this before your code:
#' @include sim.R table.R
call roxygenize(), and the Collate line will be generated for you in the DESCRIPTION file.
But even doing that, external libraries you may depend on are not yet loaded by the package, which can again lead to failure at build time.
In conclusion, you'd better not leave code outside functions in a .R file located inside a package.
Since you're building a package, the reason you're having trouble accessing the other functions in your R/ directory is that you first need to run:
library(devtools)
document()
from within the working directory of your package. Now each function in your package should be accessible to any other function. Then, to finish up, do:
build()
install()
although it should be noted that a simple document() call will already be sufficient to solve your problem.
Make your functions global by defining them with <<- instead of <- and they will become available to any other script running in that environment.
I am new to PL/SQL, so I'm just trying to figure out the general flow of creating a package:
CREATE OR REPLACE PACKAGE P1 AS
PROCEDURE PROC1
(
);
END P1;
CREATE OR REPLACE PACKAGE BODY P1 AS
-- package definition
END P1;
Is this the correct way to define the package?
Basically, I am trying to find out whether I can declare the package and define the package body in the same file, or whether I need to create two separate files.
When I try to execute it, I get the error: Encountered the word 'PROCEDURE' when expecting one of the following
Generally, you have the specification of the package in a file, and the body in another. Why are you trying to put them in the same file?
It doesn't matter for an individual package, but you must declare the specification before the body.
Where you are creating multiple packages it is best to create all of the specifications first because the bodies can then compile even if they reference a different package for which the body has not yet been created.
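For reference, here is a minimal sketch of a spec and body in one script. Note that a parameterless procedure is declared without parentheses (the empty parentheses in the question are a likely cause of the parse error), and that script tools such as SQL*Plus need a slash to terminate each unit:
CREATE OR REPLACE PACKAGE p1 AS
  PROCEDURE proc1;
END p1;
/
CREATE OR REPLACE PACKAGE BODY p1 AS
  PROCEDURE proc1 IS
  BEGIN
    NULL; -- placeholder body
  END proc1;
END p1;
/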
I have built an R package but I do not want my users to have to install it before using it.
Is there a way to load a package without having to install it?
For example, if I have a package mypackage.tar.gz, is there something like
library("mypackage.tar.gz")
?
I'll join in "the chorus" of suggesting you should really install the package.
That having been said, you can take a look at Hadley's devtools package, which will let you load a package into the session without dumping everything into your global workspace.
The package will have to be untar'd/unzipped and follow the standard R package structure.
In order for this to work, though, your users would have to have the devtools package installed, so ... I'm not sure that this is any type of win for you.
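Still, a minimal sketch of that workflow, using the file name from the question:
untar("mypackage.tar.gz")          # unpack the source tarball
devtools::load_all("mypackage")    # load it into the session without installing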
If you only need the code to be loaded without it being installed, take the raw R script and source it:
source("myScript.R")
If you have different functions, you can create an R script that just loads all the necessary source files. What I sometimes do when developing is name all my functions F_some_function.R and my classes Class_some_function.R. This allows me to source a main file containing the following code:
funcdir <- "C:/Some/Path"
files <- dir(funcdir)
srcfiles <- c(grep("^Class_", files, value = TRUE),
              grep("^F_", files, value = TRUE))
for (i in file.path(funcdir, srcfiles)) source(i)
If you present them with the tarred file, they can untar it themselves using untar() before sourcing the main file.
But honestly, please use a package. With sourcing, you load everything into the global environment (or into a specified environment if you use local = TRUE), but you lose all the functionality of a package. Installing a package is no hassle, and neither is removing one.
If it's a matter of write permissions on the C: drive (which is the only possible reason not to use a package that I have met in my career), one can easily set another library location. R 2.12 actually does this by itself on Windows. See ?.libPaths().
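For example, a short sketch of setting up a user-writable library (the path and package name are placeholders):
dir.create("~/Rlibs", showWarnings = FALSE)
.libPaths(c("~/Rlibs", .libPaths()))   # prepend the user library for this session
install.packages("mypackage.tar.gz", repos = NULL, type = "source", lib = "~/Rlibs")
library(mypackage)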