We're trying to use an R package called Pmetrics.
It does not normally support a Linux environment, but we'd like to perform some tests in a parallel setup, e.g. a distributed test with 1,000 runs spread over 20 containers.
Using this Dockerfile you can reproduce the error.
FROM centos:centos7.5.1804
RUN yum -y install gcc-gfortran epel-release
RUN mkdir /root/pksim
RUN mkdir /root/pksim/Rlibraries
WORKDIR /root/pksim
RUN yum -y install R
COPY test.r test.r
COPY install.r install.r
The install.r script contains these lines:
.libPaths("/root/pksim/Rlibraries")
install.packages("http://www.lapk.org/software/Pmetrics/Repos/src/contrib/Pmetrics_1.5.2.tar.gz", repos=NULL)
The test.r script contains these lines:
r = getOption("repos")
r["CRAN"] = "http://lib.ugent.be/CRAN/"
options(repos = r)
rm(r)
.libPaths("/root/pksim/Rlibraries")
library(Pmetrics)
PMbuild()
You can reproduce the problem with these commands:
Build the image: docker build -t pksim .
Run the image: docker run -ti pksim /bin/bash. A console should appear.
Run the install.r script (in the console): Rscript install.r
Run the test.r script (in the console): Rscript test.r. It gets stuck when it calls PMbuild().
When executing test.r, the process gets stuck in an endless loop, repeatedly requesting this user input:
Pmetrics needs to know which Fortran compiler you are using.
You only have to specify this once.
However, you can reconfigure if your compiler changes
by using the command PMFortranConfig(reconfig=T).
In each of the following <exec> is a place holder for the executable file name
and <files> is a placeholder for the files to be compiled. Both are required.
When applicable serial and parallel compile statements in Pmetrics are listed in that order.
1. gfortran -m64 -w -O3 -o <exec> <files>
gfortran -O3 -w -fopenmp -fmax-stack-var-size=32768 -o <exec> <files>
2. g95: g95 -o -fstatic <exec> <files>
3. Intel Visual: ifort -o <exec> <files>
4. Lahey: lf90 <files> -fix -out <exec>
5. Other (define custom command)
6. Help, I don't have a Fortran compiler!
Enter the number of your compiler:
Enter the number of your compiler:
(the prompt repeats indefinitely)
Is there a way to configure the default compiler by setting some option before running PMbuild()?
We've created a GitHub repo containing the sources for this problem.
Can anyone think of a workaround?
A file ~/.config/Pmetrics/FortConfig.txt can be created with the contents:
gfortran -m64 -w -O3 -o <exec> <files>
gfortran -O3 -w -fopenmp -fmax-stack-var-size=32768 -o <exec> <files>
This assumes you want to use compiler option 1 (gfortran) from the list above.
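A minimal sketch of automating this inside the container: appending something like the following to install.r should let PMbuild() run without prompting, assuming the default config location above (cfg_dir is just a helper variable introduced here).
cfg_dir <- file.path(Sys.getenv("HOME"), ".config", "Pmetrics")
dir.create(cfg_dir, recursive = TRUE, showWarnings = FALSE)
# write the two compile commands (serial, then parallel) that option 1 would configure
writeLines(c(
  "gfortran -m64 -w -O3 -o <exec> <files>",
  "gfortran -O3 -w -fopenmp -fmax-stack-var-size=32768 -o <exec> <files>"
), file.path(cfg_dir, "FortConfig.txt"))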
Related
There's probably a gazillion threads on OSX+Rcpp+openMP, but the bottom line right now appears to be this (per coatless):
Unfortunately, with R 4.0.0 the CRAN distributed version of R loses
the ability to use OpenMP without a custom setup.
I came across other ideas, including compiling llvm yourself, using homebrew or macports to install R and/or llvm and/or gcc, and then figuring out how to use the right compiler and/or flags with (R)cpp. However, I find this all very confusing.
I am not a mac user, but it seems to me that setting up a mac to compile Rcpp packages or code snippets with openMP seems to be too difficult for most mac users. However, I would like my R package on github to be used by more users, and since it relies on openMP, I am losing that audience.
Can someone provide the necessary steps to set up R on mac in a way that it can compile Rcpp code with openMP? I'd like to turn that into a quick tutorial.
EDIT: I should have added: on Apple Silicon, because there is some extra confusion about where things go (/usr/local vs /opt).
I spent a day figuring this out (original post here); here are the steps I used to compile R packages from source with openMP:
Install xcode from the app store (instructions for installing xcode) then install/reinstall the xcode command line tools from the terminal:
# To delete an existing command line tools installation:
sudo rm -rf /Library/Developer/CommandLineTools
# To install the command line tools
sudo xcode-select --install
Install gcc via Homebrew (instructions for installing Homebrew) or, if you already have gcc installed, skip to step 3.
# WARNING: This can take several hours
brew install gcc
To avoid "legacy" version issues:
brew cleanup
brew update
brew upgrade
brew reinstall gcc
Link some headers into /usr/local/include
sudo ln -s /Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include/* /usr/local/include/
# You can ignore warnings like this:
#ln: /usr/local/include//tcl.h: File exists
#ln: /usr/local/include//tclDecls.h: File exists
#ln: /usr/local/include//tclPlatDecls.h: File exists
#ln: /usr/local/include//tclTomMath.h: File exists
#ln: /usr/local/include//tclTomMathDecls.h: File exists
#ln: /usr/local/include//tk.h: File exists
#ln: /usr/local/include//tkDecls.h: File exists
#ln: /usr/local/include//tkPlatDecls.h: File exists
Check your version of gfortran (cd /usr/local/gfortran/lib/gcc/x86_64-apple-darwin19/; ls), then edit your ~/.R/Makevars file (creating it first if you don't already have a file called Makevars in your ~/.R/ directory) and include only these lines:
LOC = /usr/local/gfortran
CC=$(LOC)/bin/gcc -fopenmp
CXX=$(LOC)/bin/g++ -fopenmp
CXX11 = $(LOC)/bin/g++ -fopenmp
CFLAGS=-g -O3 -Wall -pedantic -std=gnu99 -mtune=native -pipe
CXXFLAGS=-g -O3 -Wall -pedantic -std=c++11 -mtune=native -pipe
LDFLAGS=-L$(LOC)/lib -Wl,-rpath,$(LOC)/lib
CPPFLAGS=-I$(LOC)/include -I/Library/Developer/CommandLineTools/SDKs/MacOSX.sdk/usr/include
# (check that the version of gfortran - in this case 10.2.0 - matches the version specified in FLIBS)
FLIBS=-L/usr/local/gfortran/lib/gcc/x86_64-apple-darwin19/10.2.0 -L/usr/local/gfortran/lib -lgfortran -lquadmath -lm
CXX1X=/usr/local/gfortran/bin/g++
CXX98=/usr/local/gfortran/bin/g++
CXX11=/usr/local/gfortran/bin/g++
CXX14=/usr/local/gfortran/bin/g++
CXX17=/usr/local/gfortran/bin/g++
Open R and install a package to test that it compiles with openMP enabled (when asked, compile from source = "Yes"):
install.packages("data.table", type = "source")
Unfortunately, I do not believe a more "simple" setup exists.
Eventually, I found a process that works on an M1 Mac with Big Sur.
Head over to https://mac.r-project.org/; it contains most of the things you will need.
Download and install R via R-4.1-branch.pkg. The CRAN version might also work, but I used the installer from mac.r-project.org, which required opening the osx security settings to allow the installation.
Install RStudio, start it, and let it install the developer tools. Alternatively, run sudo xcode-select --install in Terminal.
Head to https://mac.r-project.org/openmp/. Download openmp-11.0.1-darwin20-Release.tar.gz and install it (see Terminal commands below).
curl -O https://mac.r-project.org/openmp/openmp-11.0.1-darwin20-Release.tar.gz
sudo tar fvx openmp-11.0.1-darwin20-Release.tar.gz -C /
Now we need to add compiler flags so that clang uses openMP. In Terminal, create the Makevars file.
cd ~
mkdir .R
nano .R/Makevars
in nano, paste these additional compiler flags into the Makevars file:
CPPFLAGS += -Xclang -fopenmp
LDFLAGS += -lomp
Hit Control+O (then Enter) to save, and Control+X to close.
Head over to the gfortran page: https://github.com/fxcoudert/gfortran-for-macOS/releases
Use the installer gfortran-ARM-11.0-BigSur.pkg to install gfortran.
For some reason the installer puts gfortran in /usr/local/gfortran, but R expects it under /opt (the mac R team likes to keep arm64 and Intel related files separate). We could go and fix paths, or simply also install gfortran under /opt. Download the tar file gfortran-ARM-11.0-BigSur.tar.xz; you can use curl, or just download it and point tar at it on the command line.
cd /opt/R/arm64/
sudo mkdir gfortran
sudo tar -xf gfortran-ARM-11.0-BigSur.tar.xz -C /opt/R/arm64/
(replace gfortran-ARM-11.0-BigSur.tar.xz with the full path if needed, e.g. /Users/YOURUSERNAME/Downloads/gfortran-ARM-11.0-BigSur.tar.xz)
Now it should work.
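As an optional, hedged verification from R, you can compile a trivial OpenMP snippet with Rcpp; this assumes Rcpp is installed and relies on its openmp plugin together with the Makevars flags above:
library(Rcpp)
sourceCpp(code = '
// [[Rcpp::plugins(openmp)]]
#ifdef _OPENMP
#include <omp.h>
#endif
#include <Rcpp.h>
// [[Rcpp::export]]
int ompThreads() {
#ifdef _OPENMP
  return omp_get_max_threads();  // > 1 means OpenMP is active
#else
  return 1;                      // compiled without OpenMP support
#endif
}')
ompThreads()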
I'm not an expert in macOS, but I'm documenting this so others can figure out how to use my R package. I'd like to streamline the process further, but wiping the mac, reinstalling macOS, and testing again takes a lot of time.
I tried to use the rocksdb library inside an R package. I used the following src/Makevars:
CXX_STD = CXX11
PKG_CPPFLAGS = -I./rocksdb/include/
PKG_LIBS = rocksdb/librocksdb.a -lbz2 -lz -lzstd -llz4 -lsnappy
$(SHLIB): rocksdb/librocksdb.a

rocksdb/librocksdb.a: rocksdb/Makefile
	CFLAGS="$(CFLAGS) $(CPICFLAGS)" AR="$(AR)" RANLIB="$(RANLIB)" LDFLAGS="$(LDFLAGS)" \
	$(MAKE) -d --jobs=3 --directory=rocksdb static_lib

clean:
	$(MAKE) --directory=rocksdb clean
Package installation failed with many errors (see build log below).
You can reproduce this case using a Docker container:
Necessary commands:
docker run --rm -ti rocker/r-ver:latest bash
Execute in container:
apt-get update
# install system deps
apt-get install -y libgflags-dev libsnappy-dev zlib1g-dev libbz2-dev liblz4-dev libzstd-dev
apt-get install -y git-core
# install R deps
install2.r Rcpp checkmate R6 tinytest
cd /tmp
git clone https://gitlab.com/artemklevtsov/rocksdb
cd rocksdb/
git submodule init
git submodule update
R CMD INSTALL .
But I can successfully run make directly in the rocksdb source directory:
cd src/rocksdb/
make static_lib
How can I fix src/Makevars to build rocksdb during R package installation?
Links:
build log
rocksdb install guide
R package source repo
Not a full answer but an observation (for now):
I tried to reproduce this in a Docker container. The R package build failed, but so did a plain build using the same flags as R, without parallel jobs and without make's debug output:
root@e8749c4bca63:/tmp/rocksdb/src# CFLAGS="-g -O2 -fstack-protector-strong -Wformat -Werror=format-security -Wdate-time -D_FORTIFY_SOURCE=2 -g -fpic" AR="ar" RANLIB="ranlib" LDFLAGS="-L/usr/local/lib" \
> make --directory=rocksdb static_lib
[...]
CC util/bloom.o
CC util/build_version.o
util/build_version.cc:5:42: error: macro "__DATE__" might prevent reproducible builds [-Werror=date-time]
const char* rocksdb_build_compile_date = __DATE__;
^~~~~~~~
cc1plus: all warnings being treated as errors
Makefile:2029: recipe for target 'util/build_version.o' failed
make: *** [util/build_version.o] Error 1
make: Leaving directory '/tmp/rocksdb/src/rocksdb'
So it looks as if -Wdate-time got promoted to -Werror=date-time.
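For reference, a hedged way to dump the flags that R itself injects (so you can compare them with what the sub-make received), run from an R session with R on the PATH:
# print the compiler flags configured in R's Makeconf
cat(system("R CMD config CFLAGS", intern = TRUE), sep = "\n")
cat(system("R CMD config CXXFLAGS", intern = TRUE), sep = "\n")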
To solve the problem, we should reset the MAKEFLAGS variable. The corrected rule looks like this:
rocksdb/librocksdb.a: rocksdb/Makefile
CFLAGS="$(CCFLAGS) $(CPICFLAGS)" MAKEFLAGS="" \
$(MAKE) -C rocksdb DISABLE_WARNING_AS_ERROR=1 static_lib
MAKEFLAGS content:
MAKEFLAGS= -- OBJECTS=RcppExports.o\ backup.o\ checkpoint.o\ db.o\ del.o\ exists.o\ get.o\ keys.o\ list.o\ options.o\ property.o\ put.o\ size.o\ sst.o\ utils.o\ version.o\ wrap.o SHLIB=rocksdb.so SHLIB_LD=$$(SHLIB_CXX11LD) SHLIB_LDFLAGS=$$(SHLIB_CXX11LDFLAGS) CXXPICFLAGS=$$(CXX11PICFLAGS) CXXFLAGS=$$(CXX11FLAGS) CXX=$$(CXX11)\ $$(CXX11STD)
I'm trying to install the Python package airflow into a virtualenv that was created using pipenv, inside a Docker container. It fails with an error that I'm clueless about.
Here is my Dockerfile:
FROM python:3.6-stretch
WORKDIR /tmp
# Define build args
ARG http_proxy
ARG https_proxy
ARG no_proxy
RUN apt-get update && \
apt-get -y install default-jdk
# Detect JAVA_HOME and export in bashrc.
# This will result in something like this being added to /etc/bash.bashrc
# export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
RUN echo export JAVA_HOME="$(readlink -f /usr/bin/java | sed "s:/jre/bin/java::")" >> /etc/bash.bashrc
# Upgrade pip
RUN pip install --upgrade pip
# Install core python packages
RUN pip install pipenv==2018.5.18
Build and run:
docker build -t pipenvtest:latest .
docker run -it pipenvtest:latest bash
When connected to the container:
pipenv --python 2.7
pipenv install --dev airflow
Which fails with this error:
building '_yaml' extension
creating build/temp.linux-x86_64-2.7/ext
x86_64-linux-gnu-gcc -pthread -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fno-strict-aliasing -Wdate-time -D_FORTIFY_SOURCE=2 -g -fdebug-prefix-map=/build/python2.7-2.7.13=. -fstack-protector-strong -Wformat -Werror=format-security -fPIC -I/usr/include/python2.7 -c ext/_yaml.c -o build/temp.linux-x86_64-2.7/ext/_yaml.o
ext/_yaml.c:4:20: fatal error: Python.h: No such file or directory
#include "Python.h"
^
compilation terminated.
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
(the ^ actually appears at the end of the line preceding it but I don't know how to format the quoted text as such)
I admit to not having the faintest idea how to go about solving this, so I'm hoping someone can give me some pointers. I hope the repro I've included here works for you.
Is the --dev switch in pipenv install --dev airflow intended? It instructs pipenv to also install Airflow's development dependencies, and one of those dependencies needs the Python.h header file (which is missing). To resolve the problem:
If you do not need the development dependencies, remove the --dev switch.
If you do need the development dependencies, install the libpython2.7-dev package, which provides Python.h, before you install Airflow: apt install libpython2.7-dev
OK, I was being really dumb. I was trying to set up a Python 2.7 virtualenv on an image built from python:3.6-stretch.
I changed
pipenv --python 2.7
to
pipenv --python 3.6
and it worked.
I've tried to solve this using the previous questions/answers on SO but without any success. So, here's my problem.
I'm using RStudio on an Ubuntu box (14.04). I tried to upgrade rJava from source and in the process managed to lose it.
I tried to install it again using,
install.packages("rJava")
which returned the following error message,
configure: error: One or more Java configuration variables are not set.
Make sure R is configured with full Java support (including JDK). Run
R CMD javareconf
as root to add Java support to R.
If you don't have root privileges, run
R CMD javareconf -e
to set all Java-related variables and then install rJava.
ERROR: configuration failed for package ‘rJava’
* removing ‘/home/darren/R/x86_64-pc-linux-gnu-library/3.2/rJava’
Warning in install.packages :
installation of package ‘rJava’ had non-zero exit status
So, I went to the terminal and typed,
sudo R CMD javareconf
which also gave the following error,
trying to compile and link a JNI program
detected JNI cpp flags :
detected JNI linker flags : -L/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server -ljvm
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -fpic -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Wformat-security -Werror=format-security -D_FORTIFY_SOURCE=2 -g -c conftest.c -o conftest.o
conftest.c:1:17: fatal error: jni.h: No such file or directory
#include <jni.h>
^
compilation terminated.
make: *** [conftest.o] Error 1
Unable to compile a JNI program
JAVA_HOME : /usr/lib/jvm/default-java
Java library path:
JNI cpp flags :
JNI linker flags :
Updating Java configuration in /usr/lib/R
Done.
I tried to follow these links, one and two, but they didn't seem to resolve my issue; there are more links on SO but I'm not sure which one to follow. I've also uninstalled and reinstalled RStudio via the Ubuntu Software Centre, but this didn't make any difference.
Can anyone else help?
In short, I want to be able to use RStudio with rJava again without it destroying any other uses of Java (such as jmol).
You don't seem to have a JDK installed. You will need at least:
sudo apt-get install openjdk-7-jdk
then re-run
sudo R CMD javareconf
Make sure you do NOT set JAVA_HOME by hand - it will be detected automatically. You should then see something like this:
$ sudo R CMD javareconf
Java interpreter : /usr/bin/java
Java version : 1.7.0_91
Java home path : /usr/lib/jvm/java-7-openjdk-amd64/jre
Java compiler : /usr/bin/javac
Java headers gen.: /usr/bin/javah
Java archive tool: /usr/bin/jar
trying to compile and link a JNI program
detected JNI cpp flags : -I$(JAVA_HOME)/../include
detected JNI linker flags : -L$(JAVA_HOME)/lib/amd64/server -ljvm
gcc -std=gnu99 -I/usr/share/R/include -DNDEBUG -I/usr/lib/jvm/java-7-openjdk-amd64/jre/../include -fpic -g -O2 -fstack-protector --param=ssp-buffer-size=4 -Wformat -Werror=format-security -D_FORTIFY_SOURCE=2 -g -c conftest.c -o conftest.o
gcc -std=gnu99 -shared -L/usr/lib/R/lib -Wl,-Bsymbolic-functions -Wl,-z,relro -o conftest.so conftest.o -L/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server -ljvm -L/usr/lib/R/lib -lR
JAVA_HOME : /usr/lib/jvm/java-7-openjdk-amd64/jre
Java library path: $(JAVA_HOME)/lib/amd64/server
JNI cpp flags : -I$(JAVA_HOME)/../include
JNI linker flags : -L$(JAVA_HOME)/lib/amd64/server -ljvm
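Once javareconf completes cleanly, a short hedged check from inside R that rJava builds and can start a JVM:
install.packages("rJava")
library(rJava)
.jinit()                                                        # start the JVM
.jcall("java/lang/System", "S", "getProperty", "java.version")  # e.g. "1.7.0_91"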
What is wrong with sudo apt-get install r-cran-rjava?
See for example this earlier answer and the question / thread around it.
For an installation from scratch, you could still do much worse than starting from sudo apt-get build-dep r-cran-rjava. It will get you the JDK corresponding to your Ubuntu version.
First, I would recommend installing RStudio from its website: https://www.rstudio.com/products/rstudio/download/ (e.g. 64-bit RStudio: https://download1.rstudio.org/rstudio-0.99.489-amd64.deb). This does not solve the problem directly, but it helps avoid other bugs with RStudio.
Regarding the error, try to make sure you have a JDK installed. I don't think the command java -version can tell whether a JDK is installed; you have to check the JDK package itself or, based on the error message, do this:
locate jni.h
The output should match or be compatible with your JAVA_HOME, e.g.:
/usr/lib/jvm/java-7-openjdk-amd64/include/jni.h
/usr/lib/jvm/java-7-oracle/include/jni.h
Update 1:
R CMD javareconf is looking for the jni.h file under $(JAVA_HOME)/include
You have a JDK installed, but it is very likely that your default java points to a JRE directory; that is why the error happened.
You can see where default-java really points by running this command:
jRealDir=$(readlink -f /usr/lib/jvm/default-java)
echo $jRealDir
# sample correct output: /usr/lib/jvm/jdk1.8.0_65
# or /usr/lib/jvm/java-8-oracle if you default to Oracle's
# now check jni.h
ls -l $jRealDir/include/jni.h
# sample expected output:
# /usr/lib/jvm/jdk1.8.0_65/include/jni.h
If the ls command fails, you have to set things up so that javareconf (and later rJava) uses the Java from the JDK, not from the JRE. You have two options:
Method 1: Do it system-wide
This is convenient, but may affect other programs like the jmol you mentioned. Don't worry, this is reversible: just re-run the command and pick the old entry. Run the following command and pick the directory that has the JDK:
sudo update-alternatives --config java
After that, test how jmol works; if it works all right, congratulations, you are now ready to test rJava. If not, try the second method below.
Method 2: Do it for R only
Put this in the .Rprofile under your home directory:
Sys.setenv(JAVA_HOME = '/usr/lib/jvm/jdk1.8.0_65')
# this sets JAVA_HOME for R to the correct Java home dir
After updating or creating the .Rprofile, DO restart R in RStudio. R CMD javareconf may still fail in this case, but it should be OK if you run it from the Shell under RStudio's Tools menu.
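After restarting R, a quick check that the setting from .Rprofile took effect (the JDK path is the example one used above):
Sys.getenv("JAVA_HOME")
# expected output: "/usr/lib/jvm/jdk1.8.0_65"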
Regarding installing RStudio from Ubuntu's stock repo: it would not make a difference for getting rJava running. Then again, I recommend installing RStudio from its homepage because the new version also has some nice features (e.g. better autocompletion, which I like the most).
Here is a link on R-Bloggers that worked for me: https://www.r-bloggers.com/installing-rjava-on-ubuntu/
sudo apt-get install -y default-jre
sudo apt-get install -y default-jdk
sudo R CMD javareconf
install.packages("rJava")
I've been dealing with this exact issue; nothing in this thread or other similar ones has solved it. I'm on Ubuntu 16.04; here's how I got it to work:
apt-get install openjdk-9-jdk
rm -rf /usr/lib/jvm/default-java
ln -s /usr/lib/jvm/java-9-openjdk-amd64/ /usr/lib/jvm/default-java
You can see where the JAVA_HOME is in the error message.
Then use locate jni.h to find where jni.h is; next, create a soft link from that location to $(JAVA_HOME)/include, just as @biocyberman mentioned.
This is what I did:
ln -s /usr/lib/jvm/java-8-openjdk-amd64/include/jni.h /opt/conda/include/jni.h
ln -s /usr/lib/jvm/java-8-openjdk-amd64/include/linux/jni_md.h /opt/conda/include/jni_md.h
ln -s /usr/lib/jvm/java-8-openjdk-amd64/jre/lib/amd64/server/libjvm.so /usr/lib/
This was necessary because my JAVA_HOME is /opt/conda and I was also missing jni_md.h and libjvm.so. I am using Ubuntu 16.04.
I was looking at the output of verbose=TRUE when I tried to sourceCpp an Rcpp file. The last output is:
DIR: C:/Users/xyz/AppData/Local/Temp/RtmpmielLn/sourcecpp_226416891d0e
C:/PROGRA~1/R/R-31~1.0/bin/x64/R CMD SHLIB -o "sourceCpp_22129.dll" --preclean "myfile.cpp"
g++ -m64 -I"C:/PROGRA~1/R/R-31~1.0/include" -DNDEBUG -I"C:/Users/xyz/Documents/R/win-library/3.1/Rcpp/include" -I"d:/RCompile/CRANpkg/extralibs64/local/include" -O2 -Wall -mtune=core2 -c myfile.cpp -o myfile.o
g++ -m64 -shared -s -static-libgcc -o sourceCpp_22129.dll tmp.def myfile.o -Ld:/RCompile/CRANpkg/extralibs64/local/lib/x64 -Ld:/RCompile/CRANpkg/extralibs64/local/lib -LC:/PROGRA~1/R/R-31~1.0/bin/x64 -lR
I have a few questions regarding this:
The 1st g++ command refers to -I"d:/RCompile/CRANpkg/extralibs64/local/include", and the 2nd command refers to -Ld:/RCompile/CRANpkg/extralibs64/local/lib/x64 and -Ld:/RCompile/CRANpkg/extralibs64/local/lib. But I don't have a D: drive or an RCompile folder anywhere. What do these things refer to?
I tried to manually run the 1st g++ command, which ran fine and created the myfile.o file, but when I tried to manually run the 2nd g++ command it gave me an error saying that it couldn't find the tmp.def file. I couldn't find the tmp.def file anywhere on my drives. Where would this tmp.def file be located?
I looked under the hood of the sourceCpp function. If I directly run the value of cmd from sourceCpp, C:/PROGRA~1/R/R-31~1.0/bin/x64/R CMD SHLIB -o "sourceCpp_22129.dll" --preclean "myfile.cpp", in Windows' command window, I noticed that it does not include -I"C:/Users/xyz/Documents/R/win-library/3.1/Rcpp/include", and R CMD SHLIB gives me an error.
How does the system(cmd, ...) call within sourceCpp include this? The value of the cmd variable in sourceCpp didn't include -I"C:/Users/xyz/Documents/R/win-library/3.1/Rcpp/include".
The 1st g++ command refers to -I"d:/RCompile/CRANpkg/extralibs64/local/include", and the 2nd command refers to -Ld:/RCompile/CRANpkg/extralibs64/local/lib/x64 and -Ld:/RCompile/CRANpkg/extralibs64/local/lib. But I don't have a D: drive or an RCompile folder anywhere. What do these things refer to?
AFAIK these are left in as part of the CRAN R Windows distribution; when R binaries are built on Windows they use something in these library paths on the build servers (but the paths stay baked into R anyhow). You can safely ignore it, but it is a bit odd. Unused / non-existent directories passed to gcc / g++ are just ignored.
I tried to manually run the 1st g++ command, which ran fine and created the myfile.o file, but when I tried to manually run the 2nd g++ command it gave me an error saying that it couldn't find the tmp.def file. I couldn't find the tmp.def file anywhere on my drives. Where would this tmp.def file be located?
tmp.def, as it sounds, is a temporary definition file created by R CMD SHLIB on Windows. If you just re-run what you see it does not get generated, so I suppose R does something behind the curtains to generate it. If you are curious about where it's generated, see share/make/winshlib.mk in the R sources.
I looked under the hood of the sourceCpp function. If I directly run the value of cmd from sourceCpp, C:/PROGRA~1/R/R-31~1.0/bin/x64/R CMD SHLIB -o "sourceCpp_22129.dll" --preclean "myfile.cpp", in Windows' command window, I noticed that it does not include -I"C:/Users/xyz/Documents/R/win-library/3.1/Rcpp/include", and R CMD SHLIB gives me an error.
This is because sourceCpp is setting the appropriate environment flags behind the scenes for you as well -- in this case, the CXXFLAGS environment variable. This gets automatically done on package installs as well when the LinkingTo: entry is specified in the DESCRIPTION file.
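As a hedged sketch (not the exact mechanism sourceCpp uses), you can reproduce the missing include flag yourself before calling R CMD SHLIB. PKG_CPPFLAGS is used here on the assumption that the SHLIB make step picks it up from the environment; the paths and file names are the ones from the question.
# point R CMD SHLIB at Rcpp's headers, then rebuild the file from the question
Sys.setenv(PKG_CPPFLAGS = paste0("-I\"", system.file("include", package = "Rcpp"), "\""))
system('C:/PROGRA~1/R/R-31~1.0/bin/x64/R CMD SHLIB -o "sourceCpp_22129.dll" --preclean "myfile.cpp"')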
I had a very similar g++ compilation command to the one posted in the question, and the compilation of a very simple C function didn't work for me.
The reason it didn't work is the following option shown in the g++ command:
-I"d:/RCompile/CRANpkg/extralibs64/local/include"
which adds an include directory on a drive that does not exist on my computer. Apparently non-existent directories listed in the -I option are ignored by g++ (as stated by Kevin Ushey) but this seems not to be the case for non-existent drives.
The error message I received was that the stdlib.h header file was not found:
C:/PROGRA~1/R/R-32~1.0/include/R.h:28:20: fatal error: d:/RCompile/r-compiling/local/local320/include/stdlib.h: Input/output error
 #include <stdlib.h>
In order to remove the offending -I option from the g++ command, I had to edit the Makeconf file located in $(R_HOME)/etc$(R_ARCH) (in my case C:\Program Files\R\R-3.2.0\etc\x64) and comment out the line:
LOCAL_SOFT = d:/RCompile/r-compiling/local/local320
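A hedged way to locate and inspect that Makeconf file from within R itself (paste0 is used because R_ARCH already starts with a slash on Windows and is empty on most other platforms):
makeconf <- paste0(R.home("etc"), Sys.getenv("R_ARCH"), "/Makeconf")
grep("LOCAL_SOFT", readLines(makeconf), value = TRUE)  # shows the line to comment out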