I'm building some packages with autoconf and automake, and would like to make sure libraries are dynamically linked (i.e. no static links).
How should one set up the autotools to force dynamic library linking?
Something like this comes to mind:
# Makefile.am
lib_LTLIBRARIES = libpart.la
libpart_la_SOURCES = lgpl_chunk.c
bin_PROGRAMS = prop
prop_SOURCES = prop.c
prop_LDADD = libpart.la
Then make sure that you always build a shared library, best done by disabling static builds by default:
# configure.ac
AC_DISABLE_STATIC
if test "x$enable_static" != "xno"; then
  AC_MSG_ERROR([Sorry Dave, I can't let you do that: static builds are disabled])
fi
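With libtool 2.x you can express the same default more compactly by passing the option directly to LT_INIT (a sketch, assuming your configure.ac uses LT_INIT rather than the older AC_PROG_LIBTOOL):
# configure.ac
LT_INIT([disable-static])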
You don't necessarily have to rely on autotools for this. You could use dlopen or some other facility to load the dynamic lib.
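For illustration, here is a minimal dlopen sketch (assuming a POSIX system, a libpart.so built from the example above, and a hypothetical exported function lgpl_chunk; link with -ldl):
/* main.c -- build with: cc main.c -o main -ldl */
#include <dlfcn.h>
#include <stdio.h>

int main(void) {
    /* Load the shared library at runtime; this only works if a .so exists. */
    void *handle = dlopen("libpart.so", RTLD_NOW);
    if (handle == NULL) {
        fprintf(stderr, "dlopen failed: %s\n", dlerror());
        return 1;
    }
    /* Look up the (hypothetical) symbol and call it if present. */
    void (*chunk)(void) = (void (*)(void))dlsym(handle, "lgpl_chunk");
    if (chunk != NULL)
        chunk();
    dlclose(handle);
    return 0;
}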
So I'm trying to make a development environment that's easily reproducible (staying away from home-manager currently to understand Nix better). After enough searching around I figured out how to make a few custom derivations, use buildEnv for package sets, and use ~/.config/nixpkgs/config.nix to do overrides. I'm now working on setting up zsh and oh-my-zsh, which have a ton of configuration options, but the only documentation I can find seems to suggest adding them to configuration.nix, which is a NixOS option I can't use.
Currently my config.nix code looks something like this:
let
  pkgs = import <nixpkgs> {};
in {
  allowUnfree = true;
  programs = {
    zsh = {
      enable = true;
      promptInit = "source ${pkgs.zsh-powerlevel9k}/share/zsh-powerlevel9k/powerlevel9k.zsh-theme";
      ohMyZsh = {
        enable = true;
        plugins = ["autojump"];
        theme = "powerlevel9k/powerlevel9k";
      };
    };
  };
  packageOverrides = pkgs: with pkgs; rec {
    all = buildEnv {
      name = "all";
      paths = with pkgs; [
        tmuxinator
        zsh
        oh-my-zsh
        autojump
        ...
      ];
    };
  };
}
My understanding so far is that within ~/.config/nixpkgs/config.nix there should be a single config set which contains things like the overrides function and corresponds to documentation examples of config.programs.zsh.enable, etc. However, nothing I write in that programs section affects or causes a different output of any of my programs.
What am I missing? How can I affect the configuration options listed here (https://github.com/NixOS/nixpkgs/blob/master/nixos/modules/programs/zsh/zsh.nix)?
You seem to be trying to use home-manager's config without using home-manager itself. As you can see in the NixOS module you linked, it actually sets up /etc/zshrc etc., so it's not intended for use in a user-local config and won't do anything there. If you look at the corresponding home-manager module, you'll see that it essentially reimplements the whole module for user-local purposes. So you won't get far with this approach without relying on home-manager.
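If you do adopt home-manager, a minimal sketch of the equivalent user-local configuration (in ~/.config/nixpkgs/home.nix; option names taken from home-manager's zsh module, so double-check them against its manual) might look like:
{ pkgs, ... }:
{
  # home-manager's user-local counterpart of the NixOS programs.zsh module
  programs.zsh = {
    enable = true;
    oh-my-zsh = {
      enable = true;
      plugins = [ "autojump" ];
      theme = "powerlevel9k/powerlevel9k";
    };
  };
}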
I have a 16-bit MPU which differs from x86-16 in the sizes of size_t, ptrdiff_t, etc. Can anybody give me detailed, clear instructions on how to customize the machine dependency (machdep) in Frama-C for my MPU?
There is currently no way to do that directly from the command line: you have to write a small OCaml script that essentially defines a new Cil_types.mach (a record containing the necessary information about your architecture) and registers it through File.new_machdep. Assuming you have a file my_machdep.ml looking like this:
let my_machdep = {
  Cil_types.sizeof_short = 2;
  sizeof_int = 2;
  sizeof_long = 4;
  (* ... see cil_types.mli for the complete list of fields to define *)
}
let () = File.new_machdep "my_machdep" my_machdep
You will then be able to launch Frama-C that way to use the new machdep:
frama-c -load-script my_machdep.ml -machdep my_machdep [normal options]
If you want to have the new machdep permanently available, you can make it a Frama-C plugin. For that, you need a Makefile of the following form:
FRAMAC_SHARE:=$(shell frama-c -print-share-path)
PLUGIN_NAME=Custom_machdep
PLUGIN_CMO=my_machdep
include $(FRAMAC_SHARE)/Makefile.dynamic
my_machdep must be the name of your .ml file; be sure to give PLUGIN_NAME a different name from the module itself. Then create an empty Custom_machdep.mli file (touch Custom_machdep.mli should do the trick). Afterwards, make && make install should compile and install the plug-in so that it is automatically loaded by Frama-C. You can verify this by launching frama-c -machdep help, which should list my_machdep among the known machdeps.
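Putting those steps together in the plug-in's directory:
touch Custom_machdep.mli
make && make install
frama-c -machdep help   # my_machdep should appear in the list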
UPDATE
If you are using some headers from Frama-C's standard library, you will also have to update $(frama-c -print-share-path)/libc/__fc_machdep.h in order to define appropriate macros (related to limits.h and stdint.h mostly).
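The exact macro names live in that header, so copy an existing block (for example the x86-32 one) and adjust it. Purely as an illustration, with hypothetical guard and values matching the record above:
/* Illustrative sketch only: mirror an existing block of __fc_machdep.h,
   guarded for your new machdep. */
#ifdef __FC_MACHDEP_MY_MACHDEP
#define INT_MAX 32767          /* sizeof_int = 2  */
#define LONG_MAX 2147483647L   /* sizeof_long = 4 */
#endif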
There's the command sbt flywayMigrate from flywaydb.org. The command requires you to set flywayUrl, flywayUser, and flywayPassword beforehand. So far so good.
Now I want to be able to use sbt flywayMigrate for two different environments; their variables should be different.
I tried to make two new commands, sbt flywayMigrateDev and sbt flywayMigrateProd, but I couldn't figure out how to connect the new commands to flywayMigrate. I also tried creating a new scope, but couldn't figure out how to wire the variables and tasks properly.
I wonder if anyone can give me an example on how to do this. I'd like to see a code example.
We can simplify the problem to:
There's the command sbt flywayMigrate that depends on flywayUrl. How do we let the command use different flywayUrls depending on how it is invoked (any other mechanism is fine, too)?
Thank you!
You should use sbt configurations for this.
Example .sbt file contents:
// Set up your configs.
lazy val prodConfig = config("prod")
lazy val devConfig = config("dev")
// Set up any configuration that's common between dev and prod.
val commonFlyway = Seq(
  // For the sake of example, a couple of shared settings.
  flywayUser := "pg_admin",
  flywayLocations := Seq("filesystem:migrations")
)
// Set up prod and dev.
inConfig(prodConfig)(flywayBaseSettings(prodConfig) ++ commonFlyway)
flywayUrl.in(prodConfig) := "jdbc:etc:proddb.somecompany.com"
// Or however you want to load your production password.
flywayPassword.in(prodConfig) := sys.env.getOrElse("PROD_PASSWD", "(unset)")
inConfig(devConfig)(flywayBaseSettings(devConfig) ++ commonFlyway)
flywayUrl.in(devConfig) := "jdbc:etc:devdb.somecompany.com"
flywayPassword.in(devConfig) := "development_passwd"
Now you can run prod:flywayMigrate and dev:flywayMigrate to migrate production and development, respectively.
See the Flyway docs page for other examples.
I am using sbt with the sbt-revolver plugin and I want to clear the terminal screen (^L) when the project is recompiled (~ re-start). How can this be done?
You could define a new command, clear, which uses JLine to clear the screen. sbt uses JLine internally, so you shouldn't have to include any extra dependency.
build.sbt
// Define a "clear" command that wipes the terminal via JLine.
def clearConsoleCommand = Command.command("clear") { state =>
  val cr = new jline.console.ConsoleReader()
  cr.clearScreen()
  state
}

lazy val root = project.in(file(".")).settings(commands += clearConsoleCommand)
Now you can run your compile like this: ~;clear;compile. On each file change this will clear the console and then run compile (assuming that's what you want).
Another solution, based on @SethTisue's answer:
alias clearScreen=eval "\u001B[2J\u001B[0\u003B0H"
Add this line to ~/.sbtrc so that sbt knows about a clearScreen command. You can then either invoke the command with ~;clearScreen;compile or define an additional alias like alias cc=~;clearScreen;compile.
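For example, your ~/.sbtrc might then contain both aliases:
alias clearScreen=eval "\u001B[2J\u001B[0\u003B0H"
alias cc=~;clearScreen;compile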
On Twitter, Paul Phillips suggested this method:
alias cc = ~ ;eval "\u001B[2J\u001B[0\u003B0H" ;compile
source: https://twitter.com/extempore2/status/403564233775775744
This specifically helps when you're doing something in continuous mode, à la ~compile:
maxErrors := 5
triggeredMessage := Watched.clearWhenTriggered
This works as of 0.13.7. The second line clears the screen before each command runs; the first line limits the number of errors. With this config you only ever have one screenful of errors to work through. You can obviously adjust maxErrors to fit your sbt window.
I'd like the ScalaDoc I generate with sbt to link to external libraries, and in sbt 0.13 we have autoAPIMappings, which is supposed to add these links for libraries that declare their apiURL. In practice, though, none of the libraries I use provide this in their pom/ivy metadata, and I suspect some of them never will.
The apiMappings setting is supposed to help with just that, but it is typed as Map[File, URL] and hence geared towards setting doc urls for unmanaged dependencies. Managed dependencies are declared as instances of sbt.ModuleID and cannot be inserted directly in that map.
Can I somehow populate the apiMappings setting with something that will associate a URL with a managed dependency?
A related question is: does sbt provide an idiomatic way of getting a File from a ModuleID? I guess I could try to evaluate some classpaths and get back Files to try and map them to ModuleIDs but I hope there is something simpler.
Note: this is related to https://stackoverflow.com/questions/18747265/sbt-scaladoc-configuration-for-the-standard-library/18747266, but that question differs by linking to the scaladoc of the standard library, for which there is a well-known File (scalaInstance.value.libraryJar), which is not the case here.
I managed to get this working for referencing scalaz and play by doing the following:
apiMappings ++= {
  val cp: Seq[Attributed[File]] = (fullClasspath in Compile).value
  // Find the jar file of a managed dependency on the compile classpath.
  // Note: .head throws if the dependency is absent (see the variation below).
  def findManagedDependency(organization: String, name: String): File =
    (for {
      entry <- cp
      module <- entry.get(moduleID.key)
      if module.organization == organization
      if module.name.startsWith(name)
    } yield entry.data).head
  Map(
    findManagedDependency("org.scalaz", "scalaz-core") -> url("https://scalazproject.ci.cloudbees.com/job/nightly_2.10/ws/target/scala-2.10/unidoc/"),
    findManagedDependency("com.typesafe.play", "play-json") -> url("http://www.playframework.com/documentation/2.2.1/api/scala/")
  )
}
YMMV of course.
The accepted answer is good, but it'll fail when assumptions about exact project dependencies don't hold. Here's a variation that might prove useful:
apiMappings ++= {
  // Map every matching module on the compile classpath to its documentation URL,
  // deriving the URL from the module's own revision.
  def mappingsFor(organization: String, names: List[String], location: String,
                  revision: String => String = identity): Seq[(File, URL)] =
    for {
      entry: Attributed[File] <- (fullClasspath in Compile).value
      module: ModuleID <- entry.get(moduleID.key)
      if module.organization == organization
      if names.exists(module.name.startsWith)
    } yield entry.data -> url(location.format(revision(module.revision)))

  val mappings: Seq[(File, URL)] =
    mappingsFor("org.scala-lang", List("scala-library"), "http://scala-lang.org/api/%s/") ++
      mappingsFor("com.typesafe.akka", List("akka-actor"), "http://doc.akka.io/api/akka/%s/") ++
      mappingsFor("com.typesafe.play", List("play-iteratees", "play-json"), "http://playframework.com/documentation/%s/api/scala/index.html", _.replaceAll("[\\d]$", "x"))

  mappings.toMap
}
(Including scala-library here is redundant, but useful for illustration purposes.)
If you run mappings foreach println, you'll get output like this (note that I don't have Akka in my dependencies):
(/Users/michaelahlers/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.7.jar,http://scala-lang.org/api/2.11.7/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-iteratees_2.11/jars/play-iteratees_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
(/Users/michaelahlers/.ivy2/cache/com.typesafe.play/play-json_2.11/jars/play-json_2.11-2.4.6.jar,http://playframework.com/documentation/2.4.x/api/scala/)
This approach:
Allows none or many matches to the module identifier.
Concisely supports multiple modules linking to the same documentation.
Or, with List("") provided as names, all modules for an organization (note that an empty Nil would match nothing, since names.exists returns false for an empty list).
Defers to the module as the version authority.
But lets you map over versions as needed.
As with Play's libraries, where x is used for the patch number.
Those improvements let you create a separate sbt file (call it scaladocMappings.sbt) that can be maintained in a single location and easily copied and pasted into any project.
As an alternative to my last suggestion, the sbt-api-mappings plugin by ThoughtWorks shows a lot of promise. Long term, that's a far more sustainable route than each project maintaining its own set of mappings.
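To use it, add the plugin to project/plugins.sbt (coordinates as I recall them; verify the name and current version against the plugin's README):
addSbtPlugin("com.thoughtworks.sbt-api-mappings" % "sbt-api-mappings" % "latest.release")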