Rebuilding an installed Ada library - build-process

So, I've successfully built several shared libraries from Ada projects. But when doing updates, I ran into a problem that also affects some GPR-managed projects: if the source files are installed in the default Ada search paths (which they have to be, to be meaningfully installed), the compiler checks the installed files, notices no change, and does nothing, instead of rebuilding the local, updated files. I know it's not just my system, because the XML/Ada README specifically states that rebuilding while a prior version of XML/Ada is already installed results in not all packages being rebuilt as they should. Trying exactly that reproduces the problem I'm experiencing with certain packages.
For example, I have a mathematics package Mathematics. I build it as a shared library and install it. I then have a child package Mathematics.Angles. I build this as a separate shared library; it finds Mathematics, builds fine, and links to it. Both libraries test fine. I then make some updates to Mathematics and attempt to rebuild it, and the aforementioned problem occurs: no build is deemed necessary, and the linker then finds no object files in the local directory to create a shared library from.
Is this a deficiency in the toolchain, or an error on my part somewhere? Is there a solution or workaround? Uninstalling everything just to update a base library is not a valid workaround.
edit: compiled as follows:
gnatmake -O2 -gnatf -gnato $source -cargs -fPIC
Away from computer right now, but the gnat.adc lists roughly the following:
--naming switches, no actual changes from GNAT default though.
pragma Assertion_Policy (Check);
pragma License(Unrestricted);
pragma Warnings(On);
pragma Wide_Character_Encoding(UTF8);
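One possible workaround (untested here) is to force recompilation: gnatmake's -f switch recompiles all sources regardless of timestamps, so fresh object files land in the local directory for the linker. A minimal sketch based on the command above:
gnatmake -f -O2 -gnatf -gnato $source -cargs -fPIC
This doesn't explain why the installed sources shadow the local ones, but it should at least produce the object files the linker needs.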

Related

R: prevent source re-compilation using devtools::install

I am in the process of developing an R package with external source code that takes a long time to compile. While compilation time isn't a big problem for a one-off installation, I have to routinely reinstall the package to test new additions. Is it possible to prevent re-compiling the source code if there haven't been any changes to it?
I don't necessarily need this to be automated, but I can't figure out a manual solution either. As my source code is in Rust, the following serves as the most representative example I have (note that it requires Rust cargo to be installed):
git clone https://github.com/r-rust/hellorust
Rscript -e "devtools::install('hellorust', quick = TRUE)"
When I run the above, I see that the hellorust.so file has been created in the src directory, but how do I make devtools::install() use this file rather than recompile everything? It doesn't seem like devtools::install(quick = TRUE) or devtools::install(build = FALSE) are meant for this...
Alternatively, is it possible to achieve the desired behavior on the Rust side of things? I don't understand why cargo would recompile everything if there haven't been any changes and the target directory is still there. That said, I'm quite new to Rust and compiled languages in general so my understanding of the broader concepts involved here is unfortunately quite limited...
I would also be interested to learn if there is a better way to test R packages during development than manually reinstalling them.
Based on the comments by r2evans, the final answer seems to be that this isn't what devtools::install is for.
As per the devtools documentation, there are three main tools for "frequent development tasks":
load_all
document
test
Of these load_all "simulates installing and reloading your package, loading R code in R/, compiled shared objects in src/ and data files in data/". By default, load_all() will not recompile source code in src/ (unless the recompile flag is set to true).
So the answer is to use load_all as opposed to install during package development and manually control when to compile the source code using something like devtools::compile_dll.
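A typical development loop might then look like this (a sketch, run from the package directory in an interactive R session; recompilation only happens when you ask for it):
devtools::compile_dll()  # rebuild the compiled code in src/ only when you choose to
devtools::load_all()     # reload R/, compiled objects, and data/ without reinstalling
devtools::test()         # run the test suite against the loaded package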

When should a Julia project have a Manifest AND Project file?

I am trying to understand when a Julia project needs a Manifest AND Project file vs. when it just needs a Project file. What are the different situations that warrant each case? I am trying to make sure my own project is set up correctly (it currently has both files).
The Manifest.toml is a snapshot of the exact state of a Julia environment. It specifies all packages that are installed in the environment with version numbers - not just the ones that have been ] added but the entire dependency graph!
The Project.toml on the other hand just lists the direct dependencies, that is the packages that have been ] added explicitly, potentially with version bounds specified in a [compat] section.
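For illustration, a minimal Project.toml for a project depending on Example.jl might look like this (the [compat] bounds are just an example; the UUID shown is Example.jl's registered one):
[deps]
Example = "7876af07-990d-54b4-ab0e-23690620f79a"

[compat]
Example = "0.5"
julia = "1"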
By checking in both files (specifically the Manifest.toml), you make your project reproducible. Another user only has to ] instantiate and will have the exact same environment that you had when working on the project. This is great for application projects which might consist of multiple Julia scripts which are not intended for use by other Julia projects.
If you only check in the Project.toml you are specifying the dependency information more loosely, leaving room for Julia's resolver to find appropriate package versions for all dependencies. This is what you should do when working on a Julia package, since people will likely want to install your package next to other packages, and overly restricting the versions of dependencies will make your package incompatible.
Hence, I'd summarize as follows:
Application / "Project" -> Project.toml + Manifest.toml
Julia Package -> Only Project.toml
For more on applications and packages, check out the glossary of the Pkg.jl documentation.
(Note that there are exceptional cases (unregistered dependencies, for example) where you might have to check in a Manifest.toml for a Julia package.)
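Reproducing such an environment from the checked-in Project.toml + Manifest.toml is then a single step (a sketch, run from the project directory):
julia --project=. -e 'using Pkg; Pkg.instantiate()'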
In Julia 1.2 and above, you can have nested Project.toml files to express test-specific dependencies. Since you may have a Project.toml in your test folder, which you would need to activate, I would also suggest including a Manifest.toml as a record of the environment under which you know for sure that your package's tests pass.
In other words, I believe in the package/application dichotomy mentioned in crstnbr's answer, and the recommendation to include Manifest.toml with applications, and I would further say that the tests within a package are like an application. The same goes for performance benchmarks that you might have in your package.
I haven't practiced this myself, but it seems like it would be nice to have the CI tests run under both the "frozen" version of the test/Manifest.toml and the latest versions that the package manager can find for each package. If the tests start failing, it would be easier to tease apart whether the breakage is caused by a change in a dependency.
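A sketch of how those two CI runs could be set up (hypothetical commands; the actual test invocation depends on your CI system):
# run 1: the frozen environment recorded in test/Manifest.toml
julia --project=test -e 'using Pkg; Pkg.instantiate()'
# run 2: the latest versions the resolver can find
julia --project=test -e 'using Pkg; Pkg.update()'
Each run would then be followed by the test suite; if only the second run breaks, a dependency update is the likely culprit.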

Codelite compiling issues (mingw32-make.exe *** [ALL] Error 2)

I'm having serious issues with CodeLite. It had been working for days, maybe even weeks, but something happened today when I took my project to school to work on it. My workspace is in a OneDrive folder so that I can work on it wherever I am. I have reinstalled CodeLite and reinstalled MinGW and set it up according to my school's instructions, but right now I can't build anything at all (see the attached error image). I have been looking at other threads but none of them have helped so far.
What do you think happened?
Edit: I seem to have fixed the issue. When you let CodeLite search for a compiler, as it does the first time you launch it, it completely messes up the tool directories. For example, the directory for the C compiler should be $(CodeLiteDir)/tools/gcc-arm/bin/arm-none-eabi-gcc.exe instead of C:\MinGW or wherever it may be installed. Also, we use a patched version of CodeLite with 'added debugging support' for the MD407, so you really don't want to update CodeLite. There were more issues, for example the C compiler options for my project, so when I built the project it complained about all sorts of things and the cursor wouldn't show up, so debugging was impossible, but I managed to fix that too.
In conclusion: this was not fun to fix, and CodeLite is sensitive.
I use Dev-C++ and got similar 'mingw32-make.exe' errors. When installing MinGW you will notice there is another directory, C:\Mingw32\MSYS\1.0\bin. Within MSYS this directory is global, and it holds some very important binary files, including its own make.exe; mingw32-make.exe uses files from this directory. Because the IDE will not know about this directory, you will need to include it in your system/environment PATH: outside of MSYS the directory is not global, and mingw32-make.exe will not be able to access those binary files.
Regardless of your compiler, if your 'make' is mingw32-make, that path must be set.
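For a quick check before editing the system environment variables, you can set it for the current cmd session only (a sketch; adjust the path to match your MinGW install):
set PATH=%PATH%;C:\Mingw32\MSYS\1.0\bin
mingw32-make --version
If that makes the errors go away, add the directory permanently via the system environment variable settings.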

Compiling haskell module Network on win32/cygwin

I am trying to compile Network.HTTP (http://hackage.haskell.org/package/network) on win32/cygwin. However, it fails with the following message:
Setup.hs: Missing dependency on a foreign library:
* Missing (or bad) header file: HsNet.h
This problem can usually be solved by installing the system package that
provides this library (you may need the "-dev" version). If the library is
already installed but in a non-standard location then you can use the flags
--extra-include-dirs= and --extra-lib-dirs= to specify where it is.
If the header file does exist, it may contain errors that are caught by the C
compiler at the preprocessing stage. In this case you can re-run configure
with the verbosity flag -v3 to see the error messages.
Unfortunately it does not give more clues. HsNet.h includes sys/uio.h, which actually should not be included and should be configured correctly.
Don't use Cygwin; instead follow Johan Tibell's way:
Installing MSYS
Install the latest Haskell Platform. Use the default settings.
Download version 1.0.11 of MSYS. You'll need the following files:
MSYS-1.0.11.exe
msysDTK-1.0.1.exe
msysCORE-1.0.11-bin.tar.gz
The files are all hosted on haskell.org as they're quite hard to find in the official MinGW/MSYS repo.
Run MSYS-1.0.11.exe followed by msysDTK-1.0.1.exe. The former asks you if you want to run a normalization step. You can skip that.
Unpack msysCORE-1.0.11-bin.tar.gz into C:\msys\1.0. Note that you can't do that using an MSYS shell, because you can't overwrite the files in use, so make a copy of C:\msys\1.0, unpack it there, and then rename the copy back to C:\msys\1.0.
Add C:\Program Files\Haskell Platform\VERSION\mingw\bin to your PATH. This is necessary if you ever want to build packages that use a configure script, like network, as configure scripts need access to a C compiler.
These steps are what Tibell uses to compile the network package for Windows, and I have used them myself successfully several times on most of the Haskell Platform releases.
It is possible to build network on win32/cygwin, and the above steps (by Jonke), though useful, may not be necessary.
While doing the configuration step, specify
runghc Setup.hs configure --configure-option="--build=mingw32"
so that the library is configured for mingw32; otherwise you will get link errors or "undefined reference" errors when you try to link against or use the network library.
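The remaining steps follow the standard Setup.hs sequence (a sketch, assuming the configure step above succeeds):
runghc Setup.hs configure --configure-option="--build=mingw32"
runghc Setup.hs build
runghc Setup.hs install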
This, combined with Yogesh Sajanikar's answer, made it work for me (on win64/cygwin):
Make sure the gcc on your PATH is NOT the MinGW/Cygwin one, but the one bundled with GHC, e.g.
C:\ghc\ghc-6.12.1\mingw\bin\gcc.exe
(Run
export PATH="/cygdrive/.../ghc-7.8.2/mingw/bin:$PATH"
before running cabal install network in the Cygwin shell)

Dependency management in R

Does R have a dependency management tool to facilitate project-specific dependencies? I'm looking for something akin to Java's maven, Ruby's bundler, Python's virtualenv, Node's npm, etc.
I'm aware of the "Depends" clause in the DESCRIPTION file, as well as the R_LIBS facility, but these don't seem to work in concert to provide a solution to some very common workflows.
I'd essentially like to be able to check out a project and run a single command to build and test the project. The command should install any required packages into a project-specific library without affecting the global R installation. E.g.:
my_project/.Rlibs/*
Unfortunately, Depends: within the DESCRIPTION file is all you get, for the following reasons:
R itself is reasonably cross-platform, but that means we need this to work across platforms and OSs
Encoding Depends: beyond R packages requires encoding the dependencies in a portable manner across operating systems; good luck encoding even something simple such as 'a PNG graphics library' in a way that can be resolved unambiguously across systems
Windows does not have a package manager
AFAIK OS X does not have a package manager that mixes what Apple ships and what other Open Source projects provide
Even among Linux distributions, you do not get consistency: just take RStudio as an example, which comes in two packages (each of which bundles its dependencies!) for RedHat/Fedora and Debian/Ubuntu
This is a hard problem.
The packrat package is precisely meant to achieve the following:
install any required packages into a project-specific library without affecting the global R installation
It allows installing different versions of the same packages in different project-local package libraries.
I am adding this answer even though this question is 5 years old, because this solution apparently didn't exist yet at the time the question was asked (as far as I can tell, packrat first appeared on CRAN in 2014).
Update (November 2019)
The newer R package renv has replaced packrat.
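The renv workflow mirrors what packrat provided (a sketch of the core commands, run in an R session from the project directory):
renv::init()      # create a project-local library and an renv.lock lockfile
renv::snapshot()  # record the exact package versions the project uses
renv::restore()   # rebuild that library elsewhere from renv.lock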
As a stop-gap, I've written a new rbundler package. It installs project dependencies into a project-specific subdirectory (e.g. <PROJECT>/.Rbundle), allowing the user to avoid using global libraries.
rbundler on Github
rbundler on CRAN
We've been using rbundler at Opower for a few months now and have seen a huge improvement in developer workflow, testability, and maintainability of internal packages. Combined with our internal package repository, we have been able to stabilize development of a dozen or so packages for use in production applications.
A common workflow:
Check out a project from github
cd into the project directory
Fire up R
From the R console:
library(rbundler)
bundle('.')
All dependencies will be installed into ./.Rbundle, and an .Renviron file will be created with the following contents:
R_LIBS_USER='.Rbundle'
Any R operations run from within this project directory will adhere to the project-specific library and package dependencies. Note that, while this method uses the package DESCRIPTION file to define dependencies, it needn't have an actual package structure. Thus, rbundler becomes a general tool for managing an R project, whether it be a simple script or a full-blown package.
You could use the following workflow:
1) Create a script file which contains everything you want to set up, and store it in your project directory as e.g. projectInit.R
2) Source this script from your .Rprofile (or any other file executed by R at startup) with a try statement:
try(source("./projectInit.R"), silent=TRUE)
This guarantees that R starts without an error message even when no projectInit.R is found.
3) If you start R in your project directory, the projectInit.R file will be sourced if present, and you are ready to go.
This is from a Linux perspective, but it should work the same way under Windows and macOS.
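A minimal projectInit.R along these lines might be (a sketch; the .Rlibs directory name just mirrors the layout from the question):
# projectInit.R: prepend a project-local library, creating it if needed
local_lib <- file.path(getwd(), ".Rlibs")
if (!dir.exists(local_lib)) dir.create(local_lib)
.libPaths(c(local_lib, .libPaths()))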
