Undefined symbol error when trying to depend on RcppArmadillo

I am trying to depend on RcppArmadillo in my package, but I get the error

unable to load shared object /tmp/Rtmp0LswYZ/Rinst82cbed4eaee/00LOCK-alt.raster/00new/alt.raster/libs/alt.raster.so: undefined symbol: dsyev_

when I run R CMD build . in my package directory. However, following the instructions in https://stackoverflow.com/a/14165455 in an interactive R session works correctly. I have also run R -e 'Rcpp::compileAttributes()' in my package directory, and it seems to generate RcppExports.cpp correctly. What am I doing wrong?

As surmised in the comments above, it is really beneficial to start from a working example.
To create one, we offer the RcppArmadillo.package.skeleton() function. Use it as follows:
edd@rob:/tmp$ Rscript -e 'RcppArmadillo::RcppArmadillo.package.skeleton("demoPkg")'
Calling kitten to create basic package.
Creating directories ...
Creating DESCRIPTION ...
Creating NAMESPACE ...
Creating Read-and-delete-me ...
Saving functions and data ...
Making help files ...
Done.
Further steps are described in './demoPkg/Read-and-delete-me'.
Adding pkgKitten overrides.
>> added .gitignore file
>> added .Rbuildignore file
Deleted 'Read-and-delete-me'.
Done.
Consider reading the documentation for all the packaging details.
A good start is the 'Writing R Extensions' manual.
And run 'R CMD check'. Run it frequently. And think of those kittens.
Adding RcppArmadillo settings
>> added Imports: Rcpp
>> added LinkingTo: Rcpp, RcppArmadillo
>> added useDynLib and importFrom directives to NAMESPACE
>> added Makevars file with Rcpp settings
>> added Makevars.win file with RcppArmadillo settings
>> added example src file using armadillo classes
>> added example Rd file for using armadillo classes
>> invoked Rcpp::compileAttributes to create wrappers
edd@rob:/tmp$
It should create these files:
edd@rob:/tmp$ tree demoPkg/
demoPkg/
├── DESCRIPTION
├── man
│   ├── demoPkg-package.Rd
│   ├── hello.Rd
│   └── rcpparma_hello_world.Rd
├── NAMESPACE
├── R
│   ├── hello.R
│   └── RcppExports.R
└── src
    ├── Makevars
    ├── Makevars.win
    ├── rcpparma_hello_world.cpp
    └── RcppExports.cpp
3 directories, 11 files
edd@rob:/tmp$
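For context on the original error: dsyev_ is a LAPACK routine, and the Makevars file added by the skeleton is what pulls in R's LAPACK/BLAS libraries at link time. A recent skeleton's src/Makevars looks roughly like this (a sketch only; exact contents can vary between RcppArmadillo versions):

# src/Makevars as generated by RcppArmadillo.package.skeleton() (approximate)
PKG_CXXFLAGS = $(SHLIB_OPENMP_CXXFLAGS)
PKG_LIBS = $(SHLIB_OPENMP_CXXFLAGS) $(LAPACK_LIBS) $(BLAS_LIBS) $(FLIBS)

Without those PKG_LIBS entries in your own package, the shared object builds but leaves LAPACK symbols such as dsyev_ unresolved, which matches the error you are seeing.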

Related

How to compile multiple simple projects with GNU make

I am trying to implement various projects from a programming book. My intention was to have each project exercise in its own folder and then have a makefile that compiles all of them with something like make all. The folder structure is like this:
.
├── Makefile
├── bin
│   ├── prog1
│   ├── prog2
│   └── prog3
└── src
    ├── prog1
    │   ├── Makefile
    │   └── main.c
    ├── prog2
    │   ├── Makefile
    │   └── main.c
    └── prog3
        ├── Makefile
        └── main.c
I would like to learn how to set up such a structure, in particular the part where the top makefile visits all folders in src, calls make there, and then copies and renames the executables into the bin folder.
Your layout schematic shows a makefile for each exercise, plus the top-level makefile that you seem actually to be asking about. It would be best for the top-level makefile to avoid duplicating the behavior of the per-exercise makefiles, as such duplication would create an additional maintenance burden for you. Additionally, it is likely that you will eventually progress to exercises involving multiple source files, and perhaps to some that have multiple artifacts to be built. This is all the more reason for each per-exercise makefile to contain everything necessary to build the exercise with which it is associated (into the exercise-specific directory), and for the top-level makefile to depend on those.
Following that scheme would leave a well-defined role for the top-level makefile: to perform the per-exercise builds (by recursively running make), and to copy the resulting binaries to bin/. This is not the only way to set up a system of cooperating makefiles, but it is fairly easy, and that will allow you to focus on the exercises instead of on the build system.
Let us suppose, then, that each individual exercise can be built by changing to its directory and running make, with the result being an executable in the same directory, with the same name as the directory. That is, from the top-level directory, executing cd src/prog2; make would produce the wanted executable as src/prog2/prog2. In that case, the top-level makefile needs little more than the names of all the exercises, and a couple of rules:
EXERCISES = prog1 prog2 prog3
BINARIES = $(EXERCISES:%=bin/%)

all: $(BINARIES)

$(BINARIES):
	make -C src/$$(basename $@)
	cp src/$$(basename $@)/$$(basename $@) $@
Note: that uses a feature specific to GNU's implementation of make to compute the names of the wanted binaries from the exercise names. I take that to be acceptable, since you tagged [gnu-make], but in any case, it is a convenience feature, not a necessity.
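If that GNU-specific convenience were ever a concern, a minimal portable alternative would be to spell the list out by hand:

# Portable spelling of the same list, without the pattern substitution:
BINARIES = bin/prog1 bin/prog2 bin/prog3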
There are different ways to tackle this, but something like this should work for your example:
PROGS := bin/prog1 bin/prog2 bin/prog3

all: $(PROGS)

$(PROGS):
	$(MAKE) -C src/$(@F)
	mkdir -p $(@D)
	cp src/$(@F)/main $@

.PHONY: clean
clean:
	rm -f $(PROGS)
	for t in $(PROGS); do make -C src/`basename $$t` clean; done
We define a list of targets (PROGS) we want to build. We say these targets are prerequisites of all, and then we define how they should be built: we recursively descend into src/ plus the filename part of the target ($(@F)) and run make there. We then create the directory of the target ($(@D)) to be sure it exists, and copy main from the directory we descended into to the path of the target.
For good measure, there is a clean target as well that removes all the PROGS and runs make clean recursively in src/.
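Both answers assume each exercise directory already contains its own makefile. Purely as a hedged illustration (the file and variable names are hypothetical), a per-exercise makefile matching the convention above of building a binary called main could look like:

# Hypothetical src/prog1/Makefile
CC     ?= cc
CFLAGS ?= -Wall -Wextra -g

# Build the exercise into a binary named 'main' in this directory
main: main.c
	$(CC) $(CFLAGS) -o main main.c

# Remove the built binary so the top-level 'clean' can recurse into here
.PHONY: clean
clean:
	rm -f main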

Is it possible to define packages within sbt's project directory

I defined some classes in sbt's project directory using no package (i.e. all my files were directly under project and they did not include any package statement). It worked fine.
Now when I tried to group them into packages and ran sbt reload, I got not found: value XXX at the line where I import the package in my build.sbt (XXX is the name of the package).
Can't the project directory deal with packages?
EDIT after comment
It will work if you add your source files under the folder project/src/main/scala.
Check this structure
tree
.
├── build.sbt
└── project
    ├── build.properties
    └── src
        └── main
            └── scala
                └── foo
                    └── Bar.scala
5 directories, 3 files
build.sbt
import foo._
version := Bar.ver
and Bar.scala
package foo
object Bar {
  val ver = "1.0.0"
}
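With that layout, sbt reload should pick up the compiled foo.Bar, and a quick sanity check (assuming an otherwise standard build) is to ask sbt for the setting that build.sbt derives from it:

sbt "show version"   # should report 1.0.0, i.e. the value of foo.Bar.ver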

R CMD check note: Non-standard files/directories found at top level (PNG file) [duplicate]

I have a package with a README.Rmd that I pass to rmarkdown::render(), producing README.md and a directory README_files, which contains the images used in README.md. This looks like the tree below.
README_files is not a standard package directory, so if it isn't in .Rbuildignore, checking the package with R CMD check shows a note:
* checking top-level files ... NOTE
Non-standard file/directory found at top level: ‘README_files’
But including the directory in .Rbuildignore leads to a warning, if and only if the package is checked --as-cran. IIUC, pandoc tries to generate HTML from README.md, but the images are unavailable because they sit in the ignored README_files directory:
Conversion of ‘README.md’ failed:
pandoc: Could not fetch README_files/unnamed-chunk-14-1.png
README_files/unnamed-chunk-14-1.png: openBinaryFile: does not exist (No such file or directory)
Is there any way to get a clean check --as-cran here?
├── README_files
│   └── figure-markdown_github
│       ├── unnamed-chunk-14-1.png
│       ├── unnamed-chunk-15-1.png
│       ├── unnamed-chunk-16-1.png
│       ├── unnamed-chunk-26-1.png
│       └── unnamed-chunk-27-1.png
├── README.md
├── README.Rmd
The current preferred solution (at least as used by ggplot2) is to store the images in man/figures/. So in the README.Rmd file, include something like the following setup chunk.
```{r, echo = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>",
  fig.path = "man/figures/README-"
)
```
That keeps the images tucked away in a place that won't generate CRAN check errors, but they are still part of the package. So you don't have to store them elsewhere or use calls to png::readPNG.
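To make the effect concrete, consider a later figure-producing chunk in README.Rmd (the chunk name example-plot is hypothetical):

```{r example-plot}
plot(pressure)
```

With the fig.path set above, knitr writes the image to man/figures/README-example-plot-1.png, and the rendered README.md references it at that path, so the figure ships with the package sources.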
There are a few options. Update: I think Rob Hyndman's solution is now better than the things I list here.
Store the image online somewhere, and include the URL in the README.
As @Axeman noted, you can follow the ggplot2 approach of storing the images at the top level, and mentioning them in .Rbuildignore.
You can store them in inst/image, and use png::readPNG(system.file("image/yourpic.png", package = "yourpkg")) to read it. Then show it in the README using a plot.
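A minimal sketch of that last option, using a chunk in README.Rmd to draw the bundled image (the names image/yourpic.png and yourpkg are placeholders carried over from the example above):

```{r, echo = FALSE}
# Read the PNG installed from inst/image/ and draw it with grid
img <- png::readPNG(system.file("image/yourpic.png", package = "yourpkg"))
grid::grid.raster(img)
```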
I followed http://r-pkgs.had.co.nz/release.html and got the same error when putting the images at the top level and adding them to .Rbuildignore.
As Richie suggested, I then added the images to inst/image and referred to them as ![](inst/image/README-unnamed-chunk-1-1.png).

How to tar all files in the current directory without typing the tar file name and all the file names?

ENV:
macOS Sierra 10.12.6
Raw input (example):
.
├── f1.md
├── f2.md
├── f3.md
├── f4.txt
├── f5.csv
└── f6.doc
0 directories, 6 files
In a test folder, there are 6 files.
Expected output:
.
├── all.tar
├── f1.md
├── f2.md
├── f3.md
├── f4.txt
├── f5.csv
└── f6.doc
0 directories, 7 files
What I tried and the problem
tar -cvf all.tar f1.md f2.md f3.md f4.txt f5.csv f6.doc
Though I get the result I want with the above command, I have to type all the file names and the archive name, which is inconvenient. For example, I can select all the files, right-click, and choose the Compress option without typing all.tar (I don't mind what the .tar file is called).
Hope
A command-line method that does not require typing specific file names.
In case you want all files, including those in the subdirectories (or if you have no subdirectories), you would run:
tar -cvf all.tar *
Bash expands * into the list of everything in the current directory, including subdirectories, before tar runs.
In case you want only those files in the current directory, but NOT in the subdirectories, then you would have to use find, in a more complicated command. Let me know if this is the case for you, and I can take the time to find that combination of commands for you.
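For completeness, here is a sketch of that find-based variant, untested and written for GNU tar (the bsdtar shipped with macOS has similar but not identical options):

# Archive only the regular files directly in the current directory,
# excluding the archive itself, without typing their names:
find . -maxdepth 1 -type f ! -name all.tar -print0 | tar -cvf all.tar --null -T -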

Why does qmake fail to rm files correctly during rebuild with a custom compiler?

I am trying to get the thrift framework test samples to build in a Qt qmake environment and I'm seeing a problem when I rebuild my project.
My question is: what am I doing wrong that causes the makefile to concatenate all the .out files together, with no spaces, in the rm call?
I'll show the output first, then go into the specifics.
Error: see the first rm command, rm -fthrifttest... (every filename is run together. Why?)
06:09:37: Running steps for project thrifttest...
06:09:37: Starting: "/usr/bin/make" clean
/opt/Qt/5.4/gcc/bin/qmake -spec linux-g++ CONFIG+=debug -o Makefile ../../thrifttest/idl/idl.pro
rm -fthrifttest/idl/tutorial.outthrifttest/idl/shared.out
rm -f Calculator.o shared_constants.o shared_types.o SharedService.o tutorial_constants.o tutorial_types.o
rm -f *~ core *.core
rm: invalid option -- 't'
Try 'rm --help' for more information.
make: [compiler_idl_clean] Error 1 (ignored)
06:09:37: The process "/usr/bin/make" exited normally.
06:09:37: Starting: "/opt/Qt/5.4/gcc/bin/qmake" /home/developer/dev/thrifttest/idl/idl.pro -r -spec linux-g++ CONFIG+=debug
06:09:37: The process "/opt/Qt/5.4/gcc/bin/qmake" exited normally.
06:09:37: Starting: "/usr/bin/make"
g++ -c -pipe -g -Wall -W -D_REENTRANT -fPIC -I../../thrifttest/idl -I. -I/opt/Qt/5.4/gcc/mkspecs/linux-g++ -o Calculator.o ../../thrifttest/idl/gen-cpp/Calculator.cpp
....
etc.
.pro file design - Qt version is 5.4
QT -= core
QT -= gui
TEMPLATE = lib
# the IDL files I am interested in.
THRIFTSOURCES = \
tutorial.thrift \
shared.thrift
#----------------------------------------------------------------------------------------
# a special tool (compiler) that generates the appropriate .cpp and .h files from the IDL generator
# called 'thrift'. There is a ton of undocumented nuggets in here
#
# IN_PWD — the base directory of the source tree
# QMAKE_FILE_BASE - the current processing filename without extension or path. Changes for each call to .commands
# QMAKE_FILE_IN - the current processing filename including its path and extension. Changes for each call to .commands
# $$escape_expand(\\n\\t) - a way to output more than one command (could have added '&&' as well)
#
# idl.output - sets the output file name that the step will generate. In this case I am creating
# a touched .out file to match against the .thrift file processed. This file is matched by
# timestamp against the current input file to determine if it needs building.
# idl.input - The tool will iterate through the specified files.
# idl.commands - this is the guts of the compiler tool. The commands listed here act upon files found
# in the .input list that need to be updated.
# idl.name - this appears to just be an internal name used in qmake;
# just ensure you use a different value for each custom compiler.
# idl.variable_out - the generated target files outputted from the build step will be added to this variable
# idl.CONFIG - target_predeps — I *think* this makes sure that the custom compiler
# is run as the first thing in the project...
# no_link - the files that are created should not be added to OBJECTS —
# i.e., they are not compiled code which should be linked
#----------------------------------------------------------------------------------------
idl.output = $${IN_PWD}/${QMAKE_FILE_BASE}.out
idl.input = THRIFTSOURCES
idl.commands = thrift -r -o "$${IN_PWD}" --gen cpp ${QMAKE_FILE_IN}$$escape_expand(\\n\\t) \
touch $${IN_PWD}/${QMAKE_FILE_BASE}.out$$escape_expand(\\n\\t)
idl.name = thrift-compiler
idl.variable_out = JUNK
idl.CONFIG = no_link target_predeps # guarantee this is called before the regular compiler actions.
QMAKE_EXTRA_COMPILERS += idl # add to the compiler action list.
#----------------------------------------------------------------------------------------
# We have no idea how many .cpp files will be generated by the IDL compiler. This
# set of lines finds all the .cpp files and then eliminates all the *.skeleton.cpp files
# (which are just sample applications and not needed by our library code).
# Once a good list is defined, then make it the SOURCES list to use as the build
# structure. The $$files(glob) is an UNDOCUMENTED function to handle wildcard searches.
# Hack. Hack. Hack.
#----------------------------------------------------------------------------------------
TESTS = $$files($${IN_PWD}/gen-cpp/*.cpp)
SKELS = $$files($${IN_PWD}/gen-cpp/*.skeleton.cpp)
TESTS -= $$SKELS
HEADERS = $$files($${IN_PWD}/gen-cpp/*.h)
SOURCES = $$TESTS
Source file directory structure
├── idl
│   ├── gen-cpp <--- generated files go in here
│   │   ├── Calculator.cpp
│   │   ├── Calculator.h
│   │   ├── Calculator_server.skeleton.cpp
│   │   ├── shared_constants.cpp
│   │   ├── shared_constants.h
│   │   ├── SharedService.cpp
│   │   ├── SharedService.h
│   │   ├── SharedService_server.skeleton.cpp
│   │   ├── shared_types.cpp
│   │   ├── shared_types.h
│   │   ├── tutorial_constants.cpp
│   │   ├── tutorial_constants.h
│   │   ├── tutorial_types.cpp
│   │   └── tutorial_types.h
│   ├── idl.pro
│   ├── shared.out
│   ├── shared.thrift
│   ├── tutorial.out
│   └── tutorial.thrift
