How to set version number in sbt when building & publishing akka-http - sbt

I'm trying to experiment with a local copy of the akka-http library. I can publish it locally with sbt publishLocal, but I can't figure out how to change the version number. build.sbt contains an organization field but no simple version field - that seems to be generated from somewhere else, and I can't figure out where. It's currently at 10.0.5, but grepping for that string in the source doesn't turn up anything obvious.
Seems like a simple question, but where is version defined? Thanks.
(I'm asking this because sbt docs tell me I should name my local version something like 0.1-SNAPSHOT. I assume there must be a simpler way to do this than by disabling the auto-generation logic and hardcoding it into build.sbt)

It seems that Akka-HTTP generates its version file as part of the build and checks it at run time.
If you look at akka-http/akka-http-core/src/main/resources/reference.conf:
# Akka HTTP version, checked against the runtime version of Akka HTTP.
# Loaded from generated conf file.
And then look at akka-http/project/Version.scala:
/**
 * Generate version.conf and akka/Version.scala files based on the version setting.
 */
object Version {
  def versionSettings: Seq[Setting[_]] = inConfig(Compile)(Seq(
    resourceGenerators += generateVersion(resourceManaged, _ / "akka-http-version.conf",
      """|akka.http.version = "%s"
         |""").taskValue,
    sourceGenerators += generateVersion(sourceManaged, _ / "akka" / "http" / "Version.scala",
      """|package akka.http
         |
         |import com.typesafe.config.Config
         |
         |object Version {
         |  val current: String = "%s"
         |  def check(config: Config): Unit = {
         |    val configVersion = config.getString("akka.http.version")
         |    if (configVersion != current) {
         |      throw new akka.ConfigurationException(
         |        "Akka JAR version [" + current + "] does not match the provided " +
         |          "config version [" + configVersion + "]")
         |    }
         |  }
         |}
         |""").taskValue
  ))

  def generateVersion(dir: SettingKey[File], locate: File => File, template: String) = Def.task[Seq[File]] {
    val file = locate(dir.value)
    val content = template.stripMargin.format(version.value)
    if (!file.exists || IO.read(file) != content) IO.write(file, content)
    Seq(file)
  }
}
I'm assuming that after the build generates the current version you should see an akka-http-version.conf file somewhere under the managed resources directory (resourceManaged).
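Since the generator simply formats whatever sbt's version setting resolves to, one way to publish under your own number, without touching the generation logic, is to override that setting for the whole build. This is only a minimal sketch, assuming the akka-http build ultimately reads the standard version key and nothing re-assigns it per project:

// version.sbt placed at the root of your akka-http checkout (a hypothetical file, not part of the repo);
// the "-local-SNAPSHOT" suffix is just an illustration.
version in ThisBuild := "10.0.5-local-SNAPSHOT"

Alternatively, from the sbt shell you can run set every version := "10.0.5-local-SNAPSHOT" followed by publishLocal, which forces the value in every scope for the current session.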

Related

path not being detected by Nextflow

I'm new to nf-core/nextflow, and needless to say the documentation does not reflect what might be actually implemented. I'm defining the basic pipeline below:
nextflow.enable.dsl=2

process RUNBLAST {

    input:
    val thr
    path query
    path db
    path output

    output:
    path output

    script:
    """
    blastn -query ${query} -db ${db} -out ${output} -num_threads ${thr}
    """
}

workflow {
    //println "I want to BLAST $params.query to $params.dbDir/$params.dbName using $params.threads CPUs and output it to $params.outdir"
    RUNBLAST(params.threads, params.query, params.dbDir, params.output)
}
Then I'm executing the pipeline with
nextflow run main.nf --query test2.fa --dbDir blast/blastDB
Then I get the following error:
N E X T F L O W ~ version 22.10.6
Launching `main.nf` [dreamy_hugle] DSL2 - revision: c388cf8f31
Error executing process > 'RUNBLAST'
Error executing process > 'RUNBLAST'
Caused by:
Not a valid path value: 'test2.fa'
Tip: you can replicate the issue by changing to the process work dir and entering the command bash .command.run
I know test2.fa exists in the current directory:
(nfcore) MN:nf-core-basicblast jraygozagaray$ ls
CHANGELOG.md conf other.nf
CITATIONS.md docs pyproject.toml
CODE_OF_CONDUCT.md lib subworkflows
LICENSE main.nf test.fa
README.md modules test2.fa
assets modules.json work
bin nextflow.config workflows
blast nextflow_schema.json
I also tried with "file" instead of path, but that is deprecated and raises other kinds of errors.
It would be helpful to know how to fix this so I can get started with the pipeline building process.
Shouldn't nextflow copy the file to the execution path?
Thanks
You get the above error because params.query is not actually a path value. It's probably just a simple String or GString. The solution is to instead supply a file object, for example:
workflow {
    query = file(params.query)
    BLAST( query, ... )
}
Note that a value channel is implicitly created by a process when it is invoked with a simple value, like the above file object. If you need to be able to BLAST multiple query files, you'll instead need a queue channel, which can be created using the fromPath factory method, for example:
params.query = "${baseDir}/data/*.fa"
params.db = "${baseDir}/blastdb/nt"
params.outdir = './results'

db_name = file(params.db).name
db_path = file(params.db).parent

process BLAST {

    publishDir(
        path: "${params.outdir}/blast",
        mode: 'copy',
    )

    input:
    tuple val(query_id), path(query)
    path db

    output:
    tuple val(query_id), path("${query_id}.out")

    """
    blastn \\
        -num_threads ${task.cpus} \\
        -query "${query}" \\
        -db "${db}/${db_name}" \\
        -out "${query_id}.out"
    """
}

workflow {

    Channel
        .fromPath( params.query )
        .map { file -> tuple(file.baseName, file) }
        .set { query_ch }

    BLAST( query_ch, db_path )
}
Note that the usual way to specify the number of threads/CPUs is the cpus directive, which can be configured using a process selector in your nextflow.config. For example:
process {
    withName: BLAST {
        cpus = 4
    }
}

Key in Configuration: how to list Configurations and Keys?

The sbt in Action book introduces a concept of Key in Configuration
It then lists the default configurations:
Compile
Test
Runtime
IntegrationTest
Q1) Is it possible to print out a list of all Configurations from a sbt session? If not, can I find information on Configurations in the sbt documentation?
Q2) For a particular Configuration, e.g. 'Compile', is it possible to print out a list of Keys for the Configuration from a sbt session? If not, can I find information on a Configuration's Keys in the sbt documentation?
List of all configurations
For this you can use a setting like so:
val allConfs = settingKey[List[String]]("Returns all configurations for the current project")

val root = (project in file("."))
  .settings(
    name := "scala-tests",
    allConfs := {
      configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
        .map(_.name)
    }
  )
This shows the name of all configurations. You can access more details about each configuration inside the map.
Output from the interactive sbt console:
> allConfs
[info] * provided
[info] * test
[info] * compile
[info] * runtime
[info] * optional
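For instance, here is a small sketch that pulls a bit more than just the name out of each Configuration (isPublic and extendsConfigs are standard fields on sbt's Configuration class):

allConfs := {
  configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
    .map(c => s"${c.name} (public: ${c.isPublic}, extends: ${c.extendsConfigs.map(_.name).mkString(", ")})")
}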
If all you want is to print them you can have a settingKey[Unit] and use println inside the setting definition.
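A minimal sketch of that print-only variant (the printConfs key name is just an illustration; because it is a setting, the println runs whenever the setting is evaluated, e.g. at project load):

val printConfs = settingKey[Unit]("Prints all configurations for the current project")

printConfs := {
  configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
    .map(_.name)
    .foreach(println)
}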
List of all the keys in a configuration
For this we need a task (there might be other ways, but I haven't explored them; in sbt I'm satisfied if something works...) and a parser to parse the user input.
It all joins the setting above in this snippet:
import sbt._
import sbt.Keys._
import complete.DefaultParsers._

val allConfs = settingKey[List[String]]("Returns all configurations for the current project")
val allKeys = inputKey[List[String]]("Prints all keys of a given configuration")

val root = (project in file("."))
  .settings(
    name := "scala-tests",
    allConfs := {
      configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList
        .map(_.name)
    },
    allKeys := {
      val configHints = s"One of: ${
        configuration.all(ScopeFilter(inAnyProject, inAnyConfiguration)).value.toList.mkString(" ")
      }"
      val configs = spaceDelimited(configHints).parsed.map(_.toLowerCase).toSet
      val extracted: Extracted = Project.extract(state.value)
      val l = extracted.session.original.toList
        .filter(set => set.key.scope.config.toOption.map(_.name.toLowerCase)
          .exists(configs.contains))
        .map(_.key.key.label)
      l
    }
  )
Now you can use it like:
$ sbt "allKeys compile"
If you are in interactive mode you can press tab after allKeys to see the prompt:
> allKeys
One of: provided test compile runtime optional
Since allKeys is a task, its output won't appear on the sbt console if you just "return it", but you can print it.
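For example, a print-friendly sketch of the same task body: identical filtering, just with a foreach(println) before returning the list.

allKeys := {
  val configs = spaceDelimited("<config>").parsed.map(_.toLowerCase).toSet
  val extracted: Extracted = Project.extract(state.value)
  val keys = extracted.session.original.toList
    .filter(set => set.key.scope.config.toOption.map(_.name.toLowerCase).exists(configs.contains))
    .map(_.key.key.label)
  keys.foreach(println)  // echo each key to the console as well as returning it
  keys
}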

Premake isn't picking up dependencies

Recently I've changed from CMake to Premake (v5.0.0-alpha8) and I'm not quite sure how to achieve the following in Premake.
I want to include some dependencies so in CMake I can do something like this:
target_link_libraries(${PROJECT_NAME}
    ${YALLA_ABS_PLATFORM}
    ${YALLA_LIBRARY})
The above will add the paths of these libraries (dir) to "Additional Include Directories" in the compiler and it will also add an entry (lib) to "Additional Dependencies" in the linker so I don't need to do anything special beyond calling target_link_libraries.
So I expected that when I'm doing something like this in Premake:
links {
    YALLA_LIBRARY
}
I'd get the same result but I don't.
I also tried to use libdirs, but it doesn't really work; I can't see the library directory and its subdirectories passed to the compiler as "Additional Include Directories" (/I) or Yalla.Library.lib passed to the linker as "Additional Dependencies".
Here is the directory structure I use:
.
|-- src
| |-- launcher
| |-- library
| | `-- utils
| `-- platform
| |-- abstract
| `-- win32
`-- tests
`-- platform
`-- win32
The library dir is defined in Premake as follows:
project(YALLA_LIBRARY)
    kind "SharedLib"
    files {
        "utils/string-converter.hpp",
        "utils/string-converter.cpp",
        "defines.hpp"
    }
The platform dir is defined in Premake as follows:
project(YALLA_PLATFORM)
    kind "SharedLib"
    includedirs "abstract"
    links {
        YALLA_LIBRARY
    }

if os.get() == "windows" then
    include "win32"
else
    return -- OS NOT SUPPORTED
end
The win32 dir is defined in Premake as follows:
files {
    "event-loop.cpp",
    "win32-exception.cpp",
    "win32-exception.hpp",
    "win32-window.cpp",
    "win32-window.hpp",
    "window.cpp"
}
And finally at the root dir I have the following Premake file:
PROJECT_NAME = "Yalla"

-- Sets global constants that represent the projects' names
YALLA_LAUNCHER = PROJECT_NAME .. ".Launcher"
YALLA_LIBRARY = PROJECT_NAME .. ".Library"
YALLA_ABS_PLATFORM = PROJECT_NAME .. ".AbstractPlatform"
YALLA_PLATFORM = PROJECT_NAME .. ".Platform"

workspace(PROJECT_NAME)
    configurations { "Release", "Debug" }
    flags { "Unicode" }
    startproject ( YALLA_LAUNCHER )
    location ( "../lua_build" )

include "src/launcher"
include "src/library"
include "src/platform"
I'm probably misunderstanding how Premake works due to lack of experience with it.
I solved it by creating a new global function and naming it includedeps.
function includedeps(workspace, ...)
    local workspace = premake.global.getWorkspace(workspace)
    local args = { ... }
    local args_count = select("#", ...)
    local func = select(args_count, ...)
    if type(func) == "function" then
        args_count = args_count - 1
        args = table.remove(args, args_count)
    else
        func = nil
    end
    for i = 1, args_count do
        local projectName = select(i, ...)
        local project = premake.workspace.findproject(workspace, projectName)
        if project then
            local topIncludeDir, dirs = path.getdirectory(project.script)
            if func then
                dirs = func(topIncludeDir)
            else
                dirs = os.matchdirs(topIncludeDir .. "/**")
                table.insert(dirs, topIncludeDir)
            end
            includedirs(dirs)
            if premake.project.iscpp(project) then
                libdirs(dirs)
            end
            links(args)
        else
            error(string.format("project '%s' does not exist.", projectName), 3)
        end
    end
end
Usage:
includedeps(PROJECT_NAME, YALLA_LIBRARY)
or
includedeps(PROJECT_NAME, YALLA_PLATFORM, function(topIncludeDir)
    return { path.join(topIncludeDir, "win32") }
end)
Update:
For this to work properly you need to make sure that when you include the dependencies they are included by their dependency order and not by the order of the directory structure.
So for example if I have the following dependency graph launcher --> platform --> library then I'll have to include them in the following order.
include "src/library"
include "src/platform"
include "src/launcher"
As opposed to the directory structure that in my case is as follow:
src/launcher
src/library
src/platform
If you include them in directory-structure order instead, it will fail and tell you that "The project 'Yalla.Platform' does not exist."

Producing two separate jars for sources and resources with package in SBT?

Because of the large size of some resource files, I'd like sbt package to create 2 jar files at the same time, e.g. project-0.0.1.jar for the classes and project-0.0.1-res.jar for the resources.
Is this doable?
[SOLUTION] based on the answer below, thanks to @gilad-hoch
1) unmanagedResources in Compile := Seq()
Now it's just classes in the default jar.
2)
val packageRes = taskKey[File]("Produces a jar containing only the resources folder")

packageRes := {
  val jarFile = new File("target/scala-2.10/" + name.value + "_" + "2.10" + "-" + version.value + "-res.jar")
  sbt.IO.jar(files2TupleRec("", file("src/main/resources")), jarFile, new java.util.jar.Manifest)
  jarFile
}

def files2TupleRec(pathPrefix: String, dir: File): Seq[Tuple2[File, String]] = {
  sbt.IO.listFiles(dir) flatMap { f =>
    if (f.isFile) Seq((f, s"${pathPrefix}${f.getName}"))
    else files2TupleRec(s"${pathPrefix}${f.getName}/", f)
  }
}

(packageBin in Compile) <<= (packageBin in Compile) dependsOn (packageRes)
Now when I do "sbt package", both the default jar and a resource jar are produced at the same time.
To not include the resources in the main jar, you could simply add the following line:
unmanagedResources in Compile := Seq()
To add another jar, you could define a new task. It would generally be something like this:
Use the sbt.IO.jar method to create the jar.
You could use something like:
def files2TupleRec(pathPrefix: String, dir: File): Seq[Tuple2[File, String]] = {
  sbt.IO.listFiles(dir) flatMap { f =>
    if (f.isFile) Seq((f, s"${pathPrefix}${f.getName}"))
    else files2TupleRec(s"${pathPrefix}${f.getName}/", f)
  }
}
files2TupleRec("",file("path/to/resources/dir")) //usually src/main/resources
Or use the built-in methods from Path to create the sources: the Traversable[(File, String)] required by the jar method.
That's basically the whole deal...
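For example, a sketch of that Path-based variant, assuming sbt 0.13's Path.allSubpaths (which maps every file under a directory to its path relative to that directory):

// Hypothetical alternative to files2TupleRec: let sbt's Path helpers build the mappings.
packageRes := {
  val resDir = (resourceDirectory in Compile).value  // usually src/main/resources
  val jarFile = target.value / (name.value + "-" + version.value + "-res.jar")
  sbt.IO.jar(sbt.Path.allSubpaths(resDir), jarFile, new java.util.jar.Manifest)
  jarFile
}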

How to set an sbt plugin's invoke scope?

val webAssemblyTask = TaskKey[Unit](
  "web-assembly",
  "assembly web/war like run-time package"
)

var out: TaskStreams = _

val baseSettings: Seq[Setting[_]] = Seq(
  webAssemblyOutputDir <<= (sourceManaged) { _ / "build" },
  webAssemblyTask <<= (
    streams,
    target,
    sourceDirectory,
    outputDirProjectName
  ) map { (out_log, targetDir, sourceDir, outputDirProjectName) =>
    out_log.log.info("web-assembly start")
    out_log.log.info("sourceDir:" + sourceDir.getAbsolutePath)
    out_log.log.info("targetDir:" + targetDir.getAbsolutePath)
    val sourceAssetsDir = (sourceDir / "webapp" / "assets").toPath
    val classesAssetsDir = (targetDir / "scala-2.10" / "classes" / "assets").toPath
    Files.createSymbolicLink(classesAssetsDir, sourceAssetsDir)
  }
)

val webAssemblySettings = inConfig(Runtime)(baseSettings)
I wrote an sbt plugin.
When I type webAssembly in the sbt console, the plugin runs OK.
But I want it to run after compile and before runtime; how can I do it?
How to set an sbt plugin's invoke scope?
I think you're confusing the configuration (also known as Maven scope) name with tasks like compile and run. They happen to have related configuration, but that doesn't mean compile task is identical to Compile configuration.
I could interpret this question to be how can a plugin setting invoke tasks scoped in some other configuration. For that you use in method like: key in (Config) or key in (Config, task). Another way to interpret it may be how can plugin tasks be scoped in a configuration. You use inConfig(Config)(...), which you're already doing. But you'd typically want plugins to be configuration neutral. See my blog post for more details on this.
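A couple of illustrative build.sbt lines using that in method with standard keys (sbt 0.13 syntax; the option values are arbitrary examples):

unmanagedResources in Compile := Seq()                 // key scoped to the Compile configuration
scalacOptions in (Compile, console) += "-deprecation"  // key scoped to a configuration and a task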
I want to run after compile, before run, how can I do it?
This makes much more sense. In sbt you mostly focus on the preconditions of the tasks. One of the useful commands is inspect tree key. You can run that for the run task and see the entire tree of tasks/settings it depends on. Here's where you see it calling compile:compile (another notation for compile in Compile):
helloworld> inspect tree run
[info] compile:run = InputTask[Unit]
[info] +-runtime:fullClasspath = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | +-runtime:exportedProducts = Task[scala.collection.Seq[sbt.Attributed[java.io.File]]]
[info] | | +-compile:packageBin::artifact = Artifact(sbt-sequential,jar,jar,None,List(compile),None,Map())
[info] | | +-runtime:configuration = runtime
[info] | | +-runtime:products = Task[scala.collection.Seq[java.io.File]]
[info] | | | +-compile:classDirectory = target/scala-2.10/sbt-0.13/classes
[info] | | | +-compile:copyResources = Task[scala.collection.Seq[scala.Tuple2[java.io.File, java.io.File]]]
[info] | | | +-compile:compile = Task[sbt.inc.Analysis]
This is useful in discovering the products task, which is "Build products that get packaged" according to the help products command:
helloworld> help products
Build products that get packaged.
Since runtime:products happens before compile:run, if it depends on your task, your task will be called before compile:run (inspect tree also shows that run resolves to compile:run).
To simplify your plugin task, I'm just going to call it sayHello:
val sayHello = taskKey[Unit]("something")

sayHello := {
  println("hello")
}
You can rewire products in Runtime as follows:
products in Runtime := {
  val old = (products in Runtime).value
  sayHello.value
  old
}
This will satisfy the "before run" part. You also want to make sure that this runs after compile. Again, just add a task dependency to it:
sayHello := {
  (compile in Compile).value
  println("hello")
}
When the user runs the run task, sbt will correctly calculate the dependencies and run the sayHello task somewhere between compile and run.
