Is there a way to use roxygen templates from a location other than the current package?
Use Case
I have a lot of packages. Many of them use the same templates for documentation.
If a change is needed in one of the templates, I would like to do it in one place and not have to update multiple packages.
Ideal Solution (in my view at the moment)
I have a package that I use for utility functions shared across multiple packages. As part of that package, I would keep the roxygen templates in its man-roxygen folder, and I would like to reference the templates from this location in other packages.
Insufficient solutions
Symbolic link to a directory: does not cover the case where an individual package needs extra templates that shouldn't live in the central repository (unless I can have more than one man-roxygen-like directory).
Symbolic links to individual files: a hassle to create and maintain, and symbolic links are not an option when SVN is used across multiple platforms, including Windows.
Templates that internally call packageABC::functionXYZ() to return the text: cumbersome, and it doesn't remove the need to copy files to individual packages.
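One workaround (a minimal sketch, not a built-in roxygen2 feature; the package name myutils and the helper sync_templates() are hypothetical) is to ship the shared templates in the utility package's inst/man-roxygen/ directory and copy them into each package before running roxygen2:

sync_templates <- function(from_pkg = "myutils") {
  # locate the templates shipped in the utility package's inst/man-roxygen/
  src <- system.file("man-roxygen", package = from_pkg)
  dir.create("man-roxygen", showWarnings = FALSE)
  # overwrite the shared templates; package-specific templates already
  # in man-roxygen/ are left alone
  file.copy(list.files(src, full.names = TRUE), "man-roxygen", overwrite = TRUE)
}

Running sync_templates() from the package root before documenting keeps every package's copies current while still allowing extra local templates.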
Related
When developing packages in R, all R source files are put in the subdirectory R/, and all compiled code is put in the subdirectory src/.
I would like to add some organisation to the files within these folders, rather than have everything dumped at the top level. For example, let's say I'm hypothetically developing a client-server application. Logically, I would like to organise all my client R source files in R/client/ and all my server R source files in R/server/.
Is it possible to organise code in subfolders when developing a package, and if so, how? The Writing R Extensions manual doesn't offer any guidance, nor does R CMD build detect files stored in subfolders under R/.
You can't use subfolders without additional setup (like defining a custom makefile). The best you can do is to use prefixes: client-a.r, client-b.r, server-a.r, server-b.r, etc.
Expanding on my comment on Hadley's (IMHO incorrect) answer:
Look at the Matrix package (written by R Core members), which has five folders below src/, two of which contain further subfolders. Another example is the Rsymphony package, (co-)written and maintained by an R Core member.
Doing this is not for the faint of heart. R strongly prefers a src/Makevars fragment over a full src/Makefile in order to be able to construct its own Makefile versions for the different subarchitectures. But if you know a little make and are willing to put the effort in, this is entirely doable -- and being done.
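As a rough illustration (an assumption on my part, not the Matrix package's actual build setup), a src/Makevars fragment can gather objects from subdirectories by overriding OBJECTS; note that $(wildcard) is a GNU make extension, which then has to be declared via SystemRequirements: GNU make in the DESCRIPTION file:

# src/Makevars -- hypothetical fragment collecting sources from subfolders
SOURCES = $(wildcard client/*.c server/*.c)
OBJECTS = $(SOURCES:.c=.o)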
That still does not make it recommended though.
I raised this with the R Core team in the thread "Allow for sub-folders in 'package/R/' directory", but they do not seem to want to improve it. So my workflow is as follows.
1) Create an R project the same as any other package, but allow sub-directories in the folder R/, such as
R/mcmc/a.R
R/mcmc/b.R
R/prediction/p1.R
R/prediction/p2.R
2) When I need to build the package, I convert all files under R/ to
R/mcmc_a.R
R/mcmc_b.R
R/prediction_p1.R
R/prediction_p2.R
...
with my package.flatten() function
3) Then I install the flattened version into R.
I wrote a simple script for Linux to do everything
https://github.com/feng-li/flutils/blob/master/inst/bin/install.HS
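For illustration only, here is a minimal sketch of what such a flattening step might look like (the author's actual package.flatten() lives in the linked flutils repository; this stand-in is an assumption):

flatten_r_dir <- function(pkg = ".") {
  r_dir <- file.path(pkg, "R")
  # find .R files sitting in subdirectories of R/
  files <- list.files(r_dir, pattern = "\\.R$", recursive = TRUE)
  nested <- files[grepl("/", files)]
  for (f in nested) {
    # R/mcmc/a.R becomes R/mcmc_a.R
    file.copy(file.path(r_dir, f), file.path(r_dir, gsub("/", "_", f)))
    unlink(file.path(r_dir, f))
  }
  # drop the now-empty subdirectories
  unlink(setdiff(list.dirs(r_dir), r_dir), recursive = TRUE)
}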
Recognizing the thread is a bit old, I just thought I'd throw in my solution to this problem. Note that my issue is similar, but I am only concerned with preserving folder hierarchies in development.
In development, I organize my script files in subfolders to my heart's content, but rather than fight R's flat hierarchy in production, I added my own "compile-time constant", so to speak.
That is, in every file located in a subfolder (not in top-level scripts/), I add the following:
if (!exists("script.debug"))
  script.debug <- FALSE
Then, I load whatever other dependencies are required as follows:
source.list <- c(
  "script_1.R",
  "script_2.R",
  "script_3.R",
  "script_4.R"
)
if (script.debug)
  source.list <- paste("./script_subfolder/", source.list, sep = "")
lapply(source.list, source)
The default assumption is that the code is in production (script.debug = FALSE), so when in development, just ensure that script.debug = TRUE and that the project's scripts/ folder is set as the working directory before loading any script files.
Of course, this example is a bit simple - it assumes that all script file dependencies exist in the same folder - but it seems a simple matter to devise a system that suits more complicated development folder hierarchies.
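One possible generalisation (my own assumption, not part of the original answer) is to source every .R file found under a folder tree during development, which removes the hand-maintained file list entirely:

source_tree <- function(root = ".") {
  # source every .R file anywhere under root, in sorted path order
  files <- list.files(root, pattern = "\\.R$", recursive = TRUE, full.names = TRUE)
  invisible(lapply(files, source))
}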
I install all third-party apps that I have to install manually (i.e. without any package manager) in /opt.
So to use those manually installed apps from the terminal, I need to add them to the PATH variable.
But I have read that the PATH variable should not grow too long, since a long PATH can slow the system down (only negligibly, but still). So instead of extending PATH, I added symlinks to the executables in a directory that is already on PATH, such as /usr/bin.
My question: I haven't found any side effects of this technique, and it is working well.
But will it cause problems later? As far as I know, /usr/bin, /bin, and /sbin are managed by package managers, so will adding symlinks like this cause any problem for them?
That would only confuse package managers if the packages might use those same pathnames. There are a few drawbacks to the symlink approach:
you have to maintain the links (and remember to remove broken links when you remove the packages to which they correspond).
occasionally you will encounter a package which is run via a shell script that checks to see where it is run from (such as the symbolic link in /usr/bin) and then assumes that all of its other parts are available on the same path (again /usr/bin). If that extends into sibling directories such as /usr/lib, it can be tedious to manage the links.
One way to deal with the links would be to make a meta-package of your own which installs the packages and adds your own post-install/uninstall rules to maintain the links.
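As a sketch of that idea (the paths and the tool name mytool are purely illustrative), the meta-package's maintainer scripts could be as small as:

#!/bin/sh
# post-install: create the link once the files land in /opt
ln -sf /opt/mytool/bin/mytool /usr/bin/mytool

and, on removal:

#!/bin/sh
# pre-remove: drop the link before the package goes away
rm -f /usr/bin/mytool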
I'm creating a JavaScript library that I want to make available internally at my company through Bower. I'm using Grunt to build my library.
My issue is that Grunt's convention is to use package.json to define dependencies, library versions, etc.
Bower, on the other hand, assumes that the same information is found in a component.json file.
What's the intended use of these two? They seem to serve essentially the same purpose. Do I need to create both and cut and paste the shared information?
We've gotten a lot of these kinds of questions, and everyone assumes we could share a lot of metadata between these formats, but the reality is that only the name and version fields are sharable, and only the version field changes regularly. If you find it cumbersome having to update two fields when you release something, there are tools out there that can automate this, e.g. grunt-bumpx.
package.json is intended for back-end purposes - in this case specifying Grunt tasks, Node dependencies, etc. bower.json (formerly component.json), on the other hand, is intended for front-end purposes.
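As a hedged illustration (the file contents below are hypothetical, not from any real project), the two manifests for the same library would duplicate only name and version:

package.json (back-end/build metadata, read by npm and Grunt):

{
  "name": "mylib",
  "version": "1.0.0",
  "devDependencies": {
    "grunt": "~0.4.1"
  }
}

bower.json (front-end metadata, read by Bower):

{
  "name": "mylib",
  "version": "1.0.0",
  "main": "dist/mylib.js"
}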
I am experimenting with my own BSD or Linux distribution. I want to organize the system files in a way that makes sense to an end user. I want them to have access to the system without all the file clutter that *nixes leave around.
Is there a way to merge several dynamic libraries into a single file without losing dynamic linking? I will have access to all of the source files.
It might be system-dependent, but at least with ELF (the executable format used by Linux), this is not possible. With ELF, shared libraries are a bit like executables: they are a final product of the linking process and are not designed to be decomposed or relinked into a different arrangement.
If you have the source for all of the components that go into a bunch of shared libraries, I suppose you could link them all together into one giant shared library, but you would use object files (*.o) or archive libraries (*.a) as the input to produce such a library.
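For example (an illustrative command with made-up archive names), the GNU linker can fold static archives into a single shared object with --whole-archive; the archives' object files must themselves have been compiled with -fPIC:

# pull every object out of both archives into one shared library
gcc -shared -o libcombined.so \
    -Wl,--whole-archive libfoo.a libbar.a -Wl,--no-whole-archive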
As alluded to in comments, there is unlikely to be a good reason to want to actually do this.