I'm trying to build documentation using Documenter.jl and ran into a weird error.
In one of the .md files, I have a few @docs blocks. It looks as follows:
## Functions
```@docs
f
```
## Some more functions
```@docs
g
```
(Neither f nor g comes with any argument specification in the .md file.) Documenter can find the docstring for g, but it cannot find the one for f. The docstring looks as follows:
"""
f
does some stuff.
"""
f and g do not come from the same file, but both come from files that are pulled into my_module.jl via include("file").
The error message reads:
Warning: no docs found for 'f' in `@docs` block in src\reference.md
```@docs
f
```
I am really at a loss as to what is happening here; does anyone have an idea?
EDIT: On GitHub, Documenter builds the documentation just fine. Only locally can it not find the docstrings. Is that a bug in Documenter, or could it be caused by some local settings?
Related
Can someone please help me understand this:
I encountered an error when calling a function from a library, specifically "steinertree" from the "SteinerNet" package. When stepping into the function with debug(steinertree), I see that the error occurs when the function in turn calls "steinertree3". When I try debug(steinertree3), I get "object 'steinertree3' not found". Similarly, I can get the code for 'steinertree' by typing it in the terminal, but not for 'steinertree3'.
So it seems to me that there are some "higher-level" functions and "hidden" functions in packages. I did eventually find the error by locating the file "steinertree.R" in the package sources on CRAN, which contains both 'steinertree' and 'steinertree3', but I'm wondering how to properly go about debugging such "hidden" functions.
Here is a simple example:
library(igraph)
library(SteinerNet)
set.seed(1)
g <- erdos.renyi.game(n = 10, p.or.m = 0.2)
plot(g)
steinertree(type = 'KB', terminals = c(1, 3), graph = g)
Thank you!
Use the triple-colon operator ::: to call a function that is not exported from the package's namespace:
package:::hidden_function()
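For example, applied to the question above (a sketch, assuming steinertree3 really is an unexported function living in the SteinerNet namespace), you can combine ::: with debug, or inspect the source with getAnywhere:
# print the source of the unexported function
SteinerNet:::steinertree3
# or search all loaded namespaces for it
getAnywhere("steinertree3")
# step through it: flag it for debugging, then call the exported wrapper
debug(SteinerNet:::steinertree3)
steinertree(type = 'KB', terminals = c(1, 3), graph = g)
undebug(SteinerNet:::steinertree3)  # turn debugging off again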
I have a Julia module file with a function, a doc-string, and a doc-test. I load it and the doc-string shows in the Julia help, but Documenter.jl cannot find the doc-string.
An example module file, src/my_module.jl, is:
module my_module
"""
add(x, y)
Dummy function
# Examples
```jldoctest
julia> add(1, 2)
3
```
"""
function add(x::Number, y::Number)
return x + y
end
end
The make file docs/make.jl is:
using Documenter, my_module
makedocs(
modules = [my_module],
format = :html,
sitename = "my_module.jl",
authors = "unknown",
doctest = true
)
The output of include("src/my_module.jl"), then ?, then my_module.add, shows that the Julia REPL found the docstring:
help?> my_module.add
add(x, y)
Dummy function
Examples
≡≡≡≡≡≡≡≡≡≡
julia> add(1, 2)
3
The output of include("docs/make.jl") shows that Documenter did not:
Documenter: setting up build directory.
Documenter: expanding markdown templates.
Documenter: building cross-references.
Documenter: running document checks.
> checking for missing docstrings.
!! 1 docstring potentially missing:
my_module.add :: Tuple{Number,Number}
> running doctests.
> checking footnote links.
Documenter: populating indices.
Documenter: rendering document.
How come the Julia REPL finds the docstring and not Documenter?
Notes: I ran Pkg.update() before running the code. Documenter has version 0.18.0, Julia has version 0.6.3.
As mentioned in the comments by @fredrikekre, I was missing @autodocs and some other details. Here is a complete setup for doc-testing in Julia with Documenter.jl.
Directory structure of my_module (from command tree, reordered for clarity):
.
|____src
| |____my_module.jl
| |____my_functions.jl
|____docs
| |____make.jl
| |____src
| | |____index.md
|____README.md
|____REQUIRE
The file src/my_module.jl is:
module my_module
# export functions you want to call without qualifications
export add_exported
using DataFrames # or any other module
# Include functions
include("my_functions.jl")
end
The file src/my_functions.jl contains exported and non-exported functions. Note how the doc-test of exported functions has no qualification, and the doc-test of non-exported functions does:
"""
add_exported(x, y)
Dummy function, exported
# Examples
```jldoctest
julia> add_exported(1, 2)
3
```
"""
function add_exported(x::Number, y::Number)
return x + y
end
"""
add_not_exported(x, y)
Dummy function, not exported
# Examples
```jldoctest
julia> my_module.add_not_exported(1, 2)
3
```
"""
function add_not_exported(x::Number, y::Number)
return x + y
end
The file docs/make.jl is:
using Documenter, my_module
makedocs(
modules = [my_module],
format = :html,
sitename = "my_module.jl",
doctest = true
)
The file docs/src/index.md contains using my_module, which brings exported functions into scope:
# Documentation
```@meta
CurrentModule = my_module
DocTestSetup = quote
using my_module
end
```
```@autodocs
Modules = [my_module]
```
The last two files are optional. The file REQUIRE serves only for remote installation of the package. It contains:
julia 0.6.3
DataFrames 0.11.6
The file README.md contains a description in Markdown:
# my_module and its description
Finally, change directory to the root of the package, start a Julia session, and type:
julia> include("src/my_module.jl");include("docs/make.jl");
Documenter: setting up build directory.
Documenter: expanding markdown templates.
Documenter: building cross-references.
Documenter: running document checks.
> checking for missing docstrings.
> running doctests.
> checking footnote links.
Documenter: populating indices.
Documenter: rendering document.
If you change the expected result of add in the doc-test from 3 to any other number, Documenter will report an error, which shows that the doctests are actually being run.
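Alternatively, instead of @autodocs you can list docstrings explicitly with an @docs block in docs/src/index.md (a sketch; the names must match the functions defined in src/my_functions.jl above):
# Documentation
```@meta
CurrentModule = my_module
DocTestSetup = quote
using my_module
end
```
```@docs
add_exported
add_not_exported
```
With @docs you control exactly which docstrings appear and in what order, while @autodocs pulls in everything from the listed modules.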
I am having problems understanding the practical difference between the different ways to externalise code in R notebooks. Even after referring to previous questions and to the documentation, the difference between sourcing external .R files and reading them with read_chunk() is still unclear to me. For practical purposes, consider the examples below:
Say I want to load libraries with an external config.R file: the most intuitive way, it seems to me, is to create config.R as
library(first_package)
library(second_package)
...
and, in the general R notebook (say, main.Rmd) call it like
```{r}
source('config.R')
```
```{r}
# use the libraries included above
```
However, this does not recognise the included packages, so it seems that sourcing an external config file is useless; likewise when using read_chunk() instead. Hence the question: how do I include libraries at the top so that they are recognised in the main markdown script?
Say I want to define global functions externally and then use them in the main notebook: along the same lines as above, one would define them in an external foo.R file and include it in the main one.
Again, it seems that read_chunk() does not do the job, whereas source('foo.R') does in this case; the documentation states that the former "only evaluates code, but does not execute it". When would one ever want to only evaluate code but not execute it? Put differently: for practical purposes, why would one ever use read_chunk() rather than source()?
This does not recognise the packages included
In your example, first_package and second_package are both available in the working environment for the second code chunk.
Try putting library(nycflights13) in the R file and head(airlines) in the second chunk of the Rmd file. Calling knit("main.Rmd") would fail if the nycflights13 package wasn't successfully loaded with source.
read_chunk does in fact accomplish this (along with source), but they go about it differently. With source, the global functions are available directly after the source call (as you have found). With read_chunk, however, since it only evaluates code but does not execute it, you need to explicitly execute the chunk before the function becomes available. (See my example with third_config_chunk below: including the empty chunk third_config_chunk in the report makes the global some_function available to subsequent chunks.)
Regarding "only evaluates code, but does not execute it", this is an entire property of R programming known as lazy evaluation. The idea being that you may want to create a number of functions or template code which is read into your R environment but is not executed on-the-spot, allowing you to modify the environment/parameters prior to evaluation. This also allows you to execute the same code chunks multiple times whereas source will only run once with what is already provided.
Consider an example where you have an external R script which contains a large amount of setup code that isn't needed in your report. It is possible to format this file into many "chunks" which will be loaded into the working environment with read_chunk but won't be evaluated until explicitly told.
In order to externalise your config.R using read_chunk() you would write the R script as:
config.R
# ---- config_preamble
## setup code that is required for config.R
## to run but not for main.Rmd
# ---- first_config_chunk
library(nycflights13)
library(MASS)
# ---- second_config_chunk
y <- 1
# ---- third_config_chunk
some_function <- function(x) {
x + y
}
# ---- fourth_config_chunk
some_function(10)
# ---- config_output
## code that is output during `source`
## and not wanted in main.Rmd
print(some_function(10))
To use this script with the externalisation methodology, you would setup main.Rmd as follows:
main.Rmd
```{r, include=FALSE}
knitr::read_chunk('config.R')
```
```{r first_config_chunk}
```
The packages are now loaded.
```{r third_config_chunk}
```
`some_function` is now available.
```{r new_chunk}
y <- 20
```
```{r fourth_config_chunk}
```
## [1] 30
```{r new_chunk_two}
y <- 100
lapply(seq(3), some_function)
```
## [[1]]
## [1] 101
##
## [[2]]
## [1] 102
##
## [[3]]
## [1] 103
```{r source_file_instead}
source("config.R")
```
## [1] 11
As you can see, if you were to source this file, there would be no way to modify the call to some_function prior to execution and the call would print an output of "11". Now that the chunks are available in the environment, they can be re-called any number of times (after for example, changing the value of y) or used any other way in the current environment (eg. new_chunk_two) which would not be possible with source if you didn't want the rest of the R script to execute.
In the shiny app that I'm creating I am attempting to add two lists together using the mapply function, however the program is displaying the following error whenever I attempt to run it.
ERROR: object 'SIMPLIFY' not found
Here is the code in question
addedUp <- mapply("+", a, b, SIMPLIFY = FALSE)
Simplify is clearly there with no typographical errors so I am totally at a loss for what may be causing this issue. I've tried googling the problem and have not found anything mentioning this specific error.
In the documentation, the simplify argument is not capitalized (although it is capitalized when I look at help(mapply) in R). Maybe this works:
addedUp <- mapply("+", a, b, simplify=FALSE)
Alternatively, you can try:
Map('+',a,b)
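For instance, with two small made-up lists a and b, both calls below return the element-wise sums as a list (Map is essentially a wrapper around mapply with SIMPLIFY = FALSE):
a <- list(1, 2, 3)
b <- list(10, 20, 30)
Map('+', a, b)                       # returns list(11, 22, 33)
mapply('+', a, b, SIMPLIFY = FALSE)  # same result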
Hope this helps.
I am trying to run an R script from the Windows command prompt (the reason is that later on I would like to run the script by using VBA).
After having set up the R environment variable (see the end of the post), the following lines of code saved in R_code.R work perfectly:
library('xlsx')
x <- cbind(rnorm(10),rnorm(10))
write.xlsx(x, 'C:/Temp/output.xlsx')
(in order to run the script and get the resulting xlsx output, I simply type the following command in the Windows command prompt: Rscript C:\Temp\R_code.R).
Now, given that this "toy example" works as expected, I moved on to my main goal, which is very similar (running a simple R script from the command line), but for some reason I cannot get it to work.
Again I have to use a specific R package (copula, used to sample some correlated random variables) and export the R output to a csv file.
The following script (R_code2.R) works perfectly in R:
library('copula')
par_1 <- list(mean=0, sd=1)
par_2 <- list(mean=0, sd=1)
myCop.norm <- ellipCopula(family='normal', dim=2, dispstr='un', param=c(0.2))
myMvd <- mvdc(myCop.norm,margins=c('norm','norm'),paramMargins=list(par_1,par_2))
x <- rMvdc(10, myMvd)
write.table(x, 'C:/Temp/R_output.csv', row.names=FALSE, col.names=FALSE, sep=',')
Unfortunately, when I try to run the same script from the command prompt as before (Rscript C:\Temp\R_code2.R) I get the following error:
Error in FUN(c("norm", "norm")[[1L]], ...) :
  could not find function "existsFunction"
Calls: mvdc -> mvdcCheckM -> mvd.has.marF -> vapply -> FUN
Do you have any idea how to fix the problem?
Any help is highly appreciated, S.
Setting up the R environment variable (Windows)
For those of you that want to replicate the code, in order to set up the environment variable you have to:
Right click on Computer -> Properties -> Advanced System Settings -> Environment variables
Double click on 'PATH' and add ; followed by the path to your Rscript.exe folder. In my case it is ;C:\Program Files\R\R-3.1.1\bin\x64.
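To check that the variable is picked up, open a new command prompt and run:
Rscript --version
which should print the version of the R scripting front-end.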
This is a tricky one that has bitten me before. According to the documentation (?Rscript),
Rscript omits the methods package as it takes about 60% of the startup time.
So your better solution IMHO is to add library(methods) near the top of your script.
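A minimal sketch of that fix, applied to the top of R_code2.R from the question (only the first line is new; existsFunction lives in the methods package, which copula's S4 code relies on):
library(methods)  # not attached by Rscript by default
library('copula')
# ... rest of R_code2.R unchanged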
For those interested, I solved the problem by simply typing the following in the command prompt:
R CMD BATCH C:\Temp\R_code2.R
It is still not clear to me why the previous command does not work. Anyway, once again, searching the R documentation (see here) proves to be an excellent choice!