Add function docstrings with Documenter.jl - julia

I'm trying to create documentation for a Julia package using Documenter.jl. (I'm fairly new to Julia in general, though.) Now I was wondering how I can get all functions with a docstring to appear in the documentation. I know I can add a function f via
````
## Functions
```@docs
f
```
````
in the markdown file but that would mean I have to add all functions manually. Isn't there a way of adding all functions automatically?

OK, for anyone interested, I just found out how to do it:
```@autodocs
Modules = [Foo, Bar]
Order = [:function, :type]
```
as shown in https://juliadocs.github.io/Documenter.jl/stable/man/syntax/#@docs-block.

Related

source() functions preventing the read of downstream functions

I am trying to teach myself R, coming from a Python programming background.
I am clearly having problems with sourcing one file (file_read_functions.R) when the functions stored in it are called from a file in the same directory (read_files.R).
read_files.R is as follows:
```r
constant_source <- 'constants.R'
function_source <- 'file_read_functions.R'
class_source <- 'classes.R'
source(class_source)
source(constant_source)
source(function_source)
cellecta_counts = read_cellecta_counts(filepath = cell_counts_by_gene_id)
```
file_read_functions.R is as follows:
```r
constants <- 'constants.R'
classes <- 'classes.R'
assignments <- 'assignment_functions.R'
source(constants)
source(classes)
source(assignments)
read_cellecta_counts = function(filepath) {
  print("hello")
  return(filepath)
}
```
With the above, if I move read_cellecta_counts to before the source functions, the code can successfully find the function. What might be the cause?
This seems like a straightforward error message to me. The function object wasn't found, which means you either haven't defined it anywhere or haven't loaded it.
If it's a function from a package, maybe you forgot to load the package, or call the function as package::function(). If it is a function you wrote as a simple script, maybe you forgot to source it or define it locally. If it's a function you wrote as part of a package, you can load all functions by using the shortcut CTRL+SHIFT+L in RStudio.
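As a minimal, hypothetical sketch of those three situations (the script name my_functions.R is just a placeholder, not from the question):
```r
# 1. A function from a package: load the package, or use the package prefix.
library(stats)
stats::median(c(1, 2, 3))

# 2. A function defined in one of your own scripts: source that file first.
# source("my_functions.R")

# 3. A function that is part of a package you are developing: load all of its
#    functions, e.g. with devtools::load_all() (what CTRL+SHIFT+L does in RStudio).
# devtools::load_all()
```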
That said, I believe you can benefit a lot from reading the chapter on Debugging from Hadley Wickham's "Advanced R" book. It is really well written and easy to understand, especially for beginners in the R language. The chapter will teach you how to use some debugging tools, either interactively or not. You can find it here.
I found it out. By commenting out the pieces of file_read_functions.R one at a time, I found that there was a typo within assignment_functions.R. It was a simple typo; I had the following:
```r
constant_source <- 'constants.R'
source(constants_source)
```
The extra 's' did it.
If you get an error like this in the future, be sure to check all of the upstream source() files; if there is an error in any of them, R will not be able to find any of the functions defined after it.
Thank you all for your patience.

Using an R Markdown Document as a source for functions

I'm looking into R Markdown for documenting functions I regularly use. I will put them into an R Markdown file to document them and then be able to read my thinking behind a function if I come back to it months later.
My question is: if I start a new R project, is it possible to source the R Markdown file and use the library of functions I have created just by calling them, similarly to sourcing a regular R file? I don't really wish to maintain two sets of function files.
I appreciate this may be a beginner's question, but any help pointing to tutorials and the like would be greatly appreciated.
Thanks
As was mentioned in the comments, you should probably create a package for this purpose. But if you insist on putting function definitions in scripts and document them using RMarkdown files, using read_chunk() from the knitr package might be the way to go.
Note that this approach differs slightly from what you requested. You wanted to have the function definition in the markdown file together with the documentation. And then you wanted to somehow source that file into your R script in order to use the function. I did not find a way to do this (even though it might be possible).
The alternative that I propose puts the function definition in its own R script, say fun.R. The Rmarkdown file then reads the function definition from fun.R and adds documentation. If you want to use the function in some other script, you can simply source fun.R (and not the markdown file). This still means that you have to maintain the code for the function definition only once.
So let me show this with an example. This is fun.R:
```r
## ---- fun
fun <- function(x) x^2
```
The first line is an identifier that will be used later. The markdown file is as follows:
````
---
title: "Documentation of fun()"
output: html_document
---

This documents the function `fun()` defined in `fun.R`.

```{r, cache = FALSE}
knitr::read_chunk("fun.R")
```

This is the function definition:

```{r fun}
```

This is an example of how to use `fun()`:

```{r use_fun}
fun(3)
```
````
The first chunk reads in fun.R using knitr::read_chunk. Later on, you can define an empty chunk that has the identifier used in fun.R as its name. This will act as if the contents of fun.R were written directly in this file. As you can see, you can also use fun() in later chunks. In the rendered HTML file, the function definition and the example output appear in place of those chunks.
In a script where you want to use fun() you simply add source("fun.R") to source the function definition.
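For instance, a script using this setup might look as follows (assuming fun.R is in the working directory):
```r
# some_analysis.R -- uses fun() without touching the R Markdown documentation
source("fun.R")
fun(3)
#> [1] 9
```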
You could also have several functions in a single R file and still document them separately. Simply put an identifier starting with ## ---- before each function definition and then create empty chunks referring to each one of the identifiers.
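A sketch of that multi-function setup, with a made-up second function gun(): fun.R gets one identifier per function, and the R Markdown file gets one empty chunk named after each identifier (one named fun, one named gun).
```r
## ---- fun
fun <- function(x) x^2

## ---- gun
gun <- function(x) x + 1
```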
This is admittedly somewhat more complicated than what you asked for, because it involves two files instead of just one. But at least there is no redundancy.
The klmr/modules package has been superseded by the box package by the same author. It is on CRAN. After the cat command below, run these lines to display the roxygen2 help for add2.
```r
box::use(./test)
box::help(test$add2)
```
Perhaps this is close enough -- you can use the GitHub klmr/modules package (not the CRAN modules package) to combine roxygen2 documentation and code in a single file without creating a package. For example, after installing the modules package, copy this to the clipboard and then paste it into the R console to create a single-file module with embedded documentation. The subsequent code then imports it, runs a function from it, and invokes help. See the documentation of the modules package for more info.
Note that this has the following advantages: (1) everything is in a single file, (2) if you later decide to move to using packages you can use the very same file in your package with roxygen2 -- no need to revise anything, (3) any learning of roxygen2 applies to packages too.
```r
# create a file with our documentation and code
Lines <- "
#' Add two numbers.
#'
#' @param x the first number.
#' @param y the second number.
#' @return The sum.
#' @note This is just a simple example.
#'
#' This function is a simple example intended to show how to use the modules
#' package with roxygen2.
add2 <- function(x, y) x + y
"
cat(Lines, file = "test.R")

# now we can import it
# devtools::install_github("klmr/modules")
library(modules)
test <- import("test")  # do not include the .R extension
test$add2(1, 2)
## [1] 3

# this will cause the help page to appear
?test$add2
```

How to create help for my own functions written in R?

I want to create help pages for my own functions. So if I define a function, e.g.:
```r
myfunct <- function(x, y, z) {...}
```
And if I type ?myfunct, I'd like to get the parameter specifications that I provide.
Is there a way to do that?
You should create a package.
I suggest using the Roxygen package for documentation.
http://roxygen.org/
http://cran.r-project.org/doc/manuals/R-exts.html
In addition to ESS, both RStudio (http://www.rstudio.com/ide/) and StatET (http://www.walware.de/goto/statet) support it.
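As a hedged sketch of what that looks like (the function body and tags below are invented for illustration, reusing the myfunct name from the question): inside a package, you write roxygen2 comments above the function and then run devtools::document() to generate the .Rd help file.
```r
#' Combine three values
#'
#' @param x The first value.
#' @param y The second value.
#' @param z The third value.
#' @return The three values combined into a vector.
#' @examples
#' myfunct(1, 2, 3)
#' @export
myfunct <- function(x, y, z) {
  c(x, y, z)
}
```
After running devtools::document() and installing the package (or using devtools::load_all()), ?myfunct shows the generated help page.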

Adding a function in R for use without loading it

So I have code for a custom function in R, e.g. simply:
```r
f <- function(x) return(x)
```
What I want to do is be able to call this function f as if it came with default R. That is, I don't want to have to source() the file containing the code to be able to use the function. Is there a way to accomplish this?
What you are looking to do is documented extensively under ?Startup. Basically, you need to customize your Rprofile.site or .Rprofile to include the functions that you want available on startup. A simple guide can be found at the Quick-R site.
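A minimal sketch of that approach, assuming a Unix-style home directory and a hypothetical helper file name:
```r
## ~/.Rprofile -- sourced into the workspace at the start of each session (see ?Startup)

# Define the function directly so it is available in every session...
f <- function(x) return(x)

# ...or keep helpers in a separate file and attach them to the search path,
# so that rm(list = ls()) does not remove them:
# sys.source("~/R/startup_functions.R",
#            envir = attach(NULL, name = "my_functions"))
```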
One thing to note: if you are commonly sharing code with others, you do need to remember what you've changed in your startup options and be sure that the relevant functions are also shared.

Print the sourced R file to an appendix using Sweave

I keep R and Rnw files separate, then load the R data/plots with load("file.R") in the first Sweave chunk. Is there a way that I can print the sourced R file to an appendix without executing all of the code? (i.e., the code is slow enough that I don't want to source() it in an echo=TRUE chunk).
Thanks!
Update -- actually, I don't think my source() idea works.
How about using a LaTeX package?
Add this to your preamble:
```latex
\usepackage{fancyvrb}
```
Then:
```latex
\VerbatimInput{yourRfile.R}
```
You can use the highlight package to output nicely formatted, colorful code:
```r
highlight("myRfile.R", renderer = renderer_latex(document = F))
```
But don't forget to put the lengthy preamble, which you get with document = T, into your LaTeX document.
You can experiment with code directly:
```r
highlight(output = "test.tex",
          parser.output = parser(text = deparse(lm)),
          renderer = renderer_latex(document = T))
```
and get the source of lm() rendered as highlighted LaTeX.
I usually solve this by:
```latex
\begin{appendix}
\section{Appendix A}
\subsection{R session information}
<<SessionInformation,echo=F,eval=T,results=tex>>=
toLatex(sessionInfo())
@
\subsection{The simulation's source code}
<<SourceCode,echo=F,eval=T>>=
Stangle(file.path("Projectpath", "RnwFile.Rnw"))
SourceCode <- readLines(file.path("Projectpath", "Codefile.R"))
writeLines(SourceCode)
@
\end{appendix}
```
Using this, you have to keep an eye on the maximum number of characters per line.
Separating R and Rnw files sort of defeats the purpose of literate programming. My own approach is to include the code chunks at the appropriate place in the text. If my audience isn't interested in the code, then I might mark it as
```latex
<<foo, echo=FALSE>>=
x <- 1:10
@
```
I might assemble the code in an appendix as
```latex
<<appendix-foo, eval=FALSE>>=
<<foo>>
@
```
which I admit is a bit of a kludge and error prone (forgotten chunks). One quickly wants to bundle the document with supporting material (data sets, useful helper functions, non-R scripts) into an R package, and these are not difficult to create. Building the package automatically creates the pdf and Stangle'd R file, which is exactly what you want. Package building can be a slow process, but installing the package does not require that the vignettes be rebuilt and so is fast and convenient for whomever you're giving the package to.
For twiddling with formatting / text, I use a global option \SweaveOpts{eval=FALSE}.
