SBT: clone Git dependencies to a custom path using a plugin

I'm creating an aggregate SBT project which depends on several other Git projects. I understand that I can refer to them as a dependency using RootProject(uri("...")) and SBT clones them into an SBT-managed path.
However, I need to download these into a custom path. The idea is to create a workspace that automatically downloads the related Git projects so that they can be worked on as well.
I was able to create a plugin with a task that clones the git repos using sbt-git plugin:
BundleResolver.scala
def resolve: Def.Initialize[Task[Seq[String]]] = Def.task {
  val log = streams.value.log
  log.info("starting bundle resolution")
  val bundles = WorkspacePlugin.autoImport.workspaceBundles.value
  val bundlePaths = bundles.map { x =>
    val bundleName = extractBundleName(x)
    val localPath = file(".").toPath.toAbsolutePath.getParent.resolveSibling(bundleName)
    log.info(s"Cloning bundle: $bundleName")
    val (resultCode, outStr, errStr) = runCommand(Seq("git", "clone", x, localPath.toString))
    resultCode match {
      case 0 =>
        log.info(outStr)
        log.info(s"cloned $bundleName to path $localPath")
      case _ =>
        log.error(s"failed to clone $bundleName")
        log.error(errStr)
    }
    localPath.toString
  }
  bundlePaths
}
WorkspacePlugin.scala
object WorkspacePlugin extends AutoPlugin {
  override def trigger = allRequirements
  override def requires: Plugins = JvmPlugin && GitPlugin

  object autoImport {
    // settings
    val workspaceBundles = settingKey[Seq[String]]("Dependency bundles for this Workspace")
    val stagingPath = settingKey[File]("Staging path")
    // tasks
    val workspaceClean = taskKey[Unit]("Remove existing Workspace dependencies")
    val workspaceImport = taskKey[Seq[String]]("Download the dependency bundles and setup builds")
  }
  import autoImport._

  override lazy val projectSettings = Seq(
    workspaceBundles := Seq(), // default to no dependencies
    stagingPath := Keys.target.value,
    workspaceClean := BundleResolver.clean.value,
    workspaceImport := BundleResolver.resolve.value
  )
  override lazy val buildSettings = Seq()
  override lazy val globalSettings = Seq()
}
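For reference, a hypothetical build.sbt using this plugin might look like the following (the bundle URLs are placeholders, not real repositories):

```scala
// build.sbt — hypothetical usage of WorkspacePlugin. Because the plugin
// triggers on allRequirements, only the setting needs to be populated;
// the URLs below are placeholder values.
workspaceBundles := Seq(
  "https://example.org/bundle-one.git",
  "https://example.org/bundle-two.git"
)
// then run `workspaceImport` from the sbt shell to clone each bundle
```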
However, this will not add the cloned repos as sub projects to the main project. How can I achieve this?
UPDATE: I had an idea to extend the RootProject logic, so that I can create custom projects that accept a Git URL, clone it into a custom path, and return a Project from it.
object WorkspaceProject {
  def apply(uri: URI): Project = {
    val bundleName = GitResolver.extractBundleName(uri.toString)
    val localPath = file(".").toPath.toAbsolutePath.getParent.resolveSibling(bundleName)
    // clone the project
    GitResolver.clone(uri, localPath)
    // note: "." must be escaped, since replaceAll takes a regex
    Project.apply(bundleName.replaceAll("\\.", "-"), localPath.toFile)
  }
}
I declared this in a plugin project, but can't access it where I'm using it. Do you think it'll work? How can I access it in my target project?

Can't believe it was this simple.
In my plugin project, I created a new object to use in place of RootProject
object WorkspaceProject {
  def apply(uri: URI): RootProject = {
    val bundleName = GitResolver.extractBundleName(uri.toString)
    val localPath = file(".").toPath.toAbsolutePath.getParent.resolve(bundleName)
    if (!localPath.toFile.exists()) {
      // clone the project
      GitResolver.clone(uri, localPath)
    }
    RootProject(file(localPath.toString))
  }
}
Then use it like this:
build.sbt
lazy val depProject = WorkspaceProject(uri("your-git-repo.git"))

lazy val root = (project in file("."))
  .settings(
    name := "workspace_1"
  )
  .dependsOn(depProject)
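If more than one Git dependency is needed, the same pattern should extend naturally; a sketch, with placeholder repository URLs:

```scala
// build.sbt — sketch with multiple workspace projects (URLs are placeholders)
lazy val depA = WorkspaceProject(uri("https://example.org/repo-a.git"))
lazy val depB = WorkspaceProject(uri("https://example.org/repo-b.git"))

lazy val root = (project in file("."))
  .settings(name := "workspace_1")
  .dependsOn(depA, depB) // compile-time dependency on both
  .aggregate(depA, depB) // run tasks like `test` across all of them
```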

Related

Corda - Specifying an app name for a mock network

If I call flowSession.getCounterpartyFlowInfo() from a unit test using MockNetwork, it returns FlowInfo(flowVersion=1, appName=<unknown>)
Here is my current MockNetwork configuration:
network = MockNetwork(
    MockNetworkParameters(
        cordappsForAllNodes = listOf(
            TestCordapp.findCordapp("com.example.contract"),
            TestCordapp.findCordapp("com.example.workflow")
        ),
        networkParameters = testNetworkParameters(
            minimumPlatformVersion = 5
        )
    )
)
Is there a way to specify the appName of an application running in a mock network?
I don't think there is a configuration for that. The appName is derived from the jar file name by removing the '.jar' extension.
For the MockNode, the packages are scanned and classes are loaded.
Here is how it's derived:
val Class<out FlowLogic<*>>.appName: String
    get() {
        val jarFile = location.toPath()
        return if (jarFile.isRegularFile() && jarFile.toString().endsWith(".jar")) {
            jarFile.fileName.toString().removeSuffix(".jar")
        } else {
            "<unknown>"
        }
    }

Kotlin reflection change instance and all members that use the instance

We are using reflection to enable our tests to be started in different environments.
A typical test would look like this:
class TestClass {
    val environment: Environment = generateEnvironment("jUnit")
    val path: String = environment.path
    // Do test stuff
}
We are using reflection like this:
class PostgresqlTest {
    val classList: List<KClass<*>> = listOf(TestClass::class)
    val postgresEnv = generateEnvironment("postgres")

    @TestFactory
    fun generateTests(): List<DynamicTest> = classList.flatMap { testClass ->
        val instance = testClass.createInstance()
        environmentProperty(testClass).setter.call(instance, postgresEnv)
        // <<generate the dynamic tests>>
    }

    fun environmentProperty(testClass: KClass<*>) =
        testClass.memberProperties.find {
            it.returnType.classifier == Environment::class
        } as KMutableProperty<*>
}
Now we have the issue that path != environment.path in the PostgresqlTest
I know this can be solved in the TestClass with lazy or get() like this
class TestClass {
    val environment: Environment = generateEnvironment("jUnit")
    val path: String by lazy { environment.path }
    // OR
    val path: String get() = environment.path
}
However this seems like a potential pitfall for future developers, especially since the first code snippet will work in TestClass and only fail for the tests where the environment is overwritten.
What is the cleanest way to ensure that path == environment.path when overwriting the property?
Ideally, if you're using a dependency injection framework (e.g. Dagger) you would want the test classes to just inject the Environment (which would allow referencing the environment path only after it's provided), for example:
class TestClass {
    @Inject lateinit var environment: Environment
    private lateinit var path: String

    @Before fun setup() {
        // do injection here
        path = environment.path
    }
}
Otherwise, I think interface delegation could be a good option here and avoids reflection entirely. For instance, create an EnvironmentHost which surfaces an environment and path property:
interface EnvironmentHost {
    var environment: Environment
    val path: String
}
Create an implementation here for test classes:
class TestEnvironmentHost : EnvironmentHost {
    override var environment: Environment = generateEnvironment("jUnit")
    override val path: String
        get() = environment.path
}
Test classes now can look like:
class TestClass : EnvironmentHost by TestEnvironmentHost() {
    @Test fun myTest() {
        val myPath = path
        val myEnvironment = environment
    }
}
And your test factory can be simplified to:
@TestFactory
fun generateTests(): List<DynamicTest> = classList.flatMap { testClass ->
    val instance = testClass.createInstance()
    // Assign an environment if the test is an EnvironmentHost. If not,
    // you could choose to treat that as a failure and require the test
    // class to be an EnvironmentHost.
    (instance as? EnvironmentHost)?.environment = postgresEnv
    ...
}
I ended up creating a new test-task in gradle for each environment:
task postgresqlIntegrationTest(type: Test, group: "Verification", description: "Runs integration tests on postgresql.") {
    dependsOn compileTestKotlin
    mustRunAfter test
    environment "env", "postgresql"
    useJUnitPlatform {
        filter {
            includeTestsMatching "*IT"
        }
    }
}
where my test class just loads the environment like this:
class TestClass {
    val environment: Environment = generateEnvironment(System.getenv("env") ?: "junit")
    // Do test stuff
}

How to get the list of subprojects dynamically in sbt 0.13

How can I programmatically (in build.sbt) find all the subprojects of the current root project in sbt 0.13?
(I have not tried Project.componentProjects yet, because it's new in sbt 1.0).
lazy val root = (project in file(".") ... )

val myTask = taskKey[Unit]("some description")

myTask := {
  val masterRoot = baseDirectory.value
  // This does not work
  // val subProjects: Seq[ProjectReference] = root.aggregate
  // So I tried to specify the subproject list explicitly; still does not work
  val subProjects = Seq[Project](subPrj1)
  subProjects.foreach { subproject =>
    // All of this works if the "subproject" is hard-coded to "subPrj1"
    val subprojectTarget = target.in(subproject).value / "classes"
    val cp = (dependencyClasspath in (subproject, Compile, compile)).value
  }
}
Got these errors:
build.sbt: error: Illegal dynamic reference: subproject
val subprojectTarget = target.in(subproject).value / "classes"
^
build.sbt: error: Illegal dynamic reference: subproject
val cp = (dependencyClasspath in(subproject, Compile, compile)).value
You can access a list of all subprojects via buildStructure.value.allProjectRefs.
The other part of your problem is an awful issue that I've also faced quite often. I was able to work around such problems by first creating a List[Task[A]] and then using a recursive function to lift it into a Task[List[A]].
def flattenTasks[A](tasks: Seq[Def.Initialize[Task[A]]]): Def.Initialize[Task[List[A]]] =
  tasks.toList match {
    case Nil => Def.task { Nil }
    case x :: xs => Def.taskDyn {
      flattenTasks(xs) map (x.value :: _)
    }
  }
myTask := {
  val classDirectories: List[File] = Def.taskDyn {
    flattenTasks {
      for (project ← buildStructure.value.allProjectRefs)
        yield Def.task { (target in project).value / "classes" }
    }
  }.value
}
I've used this approach e.g. here: utility methods actual usage
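As a side note, the same aggregation can also be expressed without the recursive helper by using ScopeFilter and the .all method (available in sbt 0.13.x and later); a sketch:

```scala
// build.sbt — sketch: collect `target` across every project via ScopeFilter
val myTask = taskKey[Unit]("some description")

myTask := {
  val log = streams.value.log
  // `.all` evaluates the `target` setting in every project of the build
  val classDirectories: Seq[File] =
    target.all(ScopeFilter(inAnyProject)).value.map(_ / "classes")
  classDirectories.foreach(dir => log.info(dir.toString))
}
```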

Is there a way to automatically generate the gradle dependencies declaration in build.gradle?

This is in the context of converting an existing java project into a gradle project.
Is there a tool or web service that would help generate the dependencies declaration in build.gradle by pointing it to a directory that contains all the dependent jars?
In general there is no such tool, but if all the jars are structured (I mean paths: com/google/guava/guava and so on) it should be easy to write such a script on your own.
Mind that it can also be done by importing the whole folder in the following way:
repositories {
    flatDir {
        dirs 'lib'
    }
}
or
dependencies {
    runtime fileTree(dir: 'libs', include: '*.jar')
}
In my comments to @Opal's answer, I said that I was working on a quick and dirty Groovy script to achieve this. I forgot to attach the script after that. Apologies for the same.
Here's my quick and dirty script. Does solve the purpose partially.
Hoping that someone can improve on it.
#! /usr/bin/env groovy
@Grab(group='org.codehaus.groovy.modules.http-builder', module='http-builder', version='0.5.2')
import static groovyx.net.http.ContentType.JSON
import groovyx.net.http.RESTClient
import groovy.json.JsonSlurper
import groovy.util.slurpersupport.GPathResult
import static groovyx.net.http.ContentType.URLENC

//def artifactid = "activation"
//def version = "1.1"
//def packaging = "jar"
//
//def mavenCentralRepository = new RESTClient( "http://search.maven.org/solrsearch/select?q=a:%22${artifactid}%22%20AND%20v:%22${version}%22%20AND%20p:%22${packaging}%22&rows=20&wt=json".toString() )
////basecamp.auth.basic userName, password
//
//def response = mavenCentralRepository.get([:])
//println response.data.response.docs
//
//def slurper = new JsonSlurper()
//def parsedJson = slurper.parseText(response.data.toString());
//
//println parsedJson.response.docs.id

def inputFile = new File("input.txt")
def fileList = inputFile.readLines().collect { it.toString().substring(it.toString().lastIndexOf('/') + 1) }

def artifactIDvsVersionMap = [:]
fileList.collectEntries(artifactIDvsVersionMap) {
    def versionIndex = it.substring(0, it.indexOf('.')).toString().lastIndexOf('-')
    [it.substring(0, versionIndex), it.substring(versionIndex + 1).minus(".jar")]
}
println artifactIDvsVersionMap

new File("output.txt").delete()
def output = new File("output.txt")
def fileWriter = new FileWriter(output, true)
def parsedGradleParameters = null
try {
    parsedGradleParameters = artifactIDvsVersionMap.collect {
        def artifactid = it.key
        def version = it.value
        def packaging = "jar"
        def mavenCentralRepository = new RESTClient("http://search.maven.org/solrsearch/select?q=a:%22${artifactid}%22%20AND%20v:%22${version}%22%20AND%20p:%22${packaging}%22&rows=20&wt=json".toString())
        def response = mavenCentralRepository.get([:])
        println response.data.response.docs.id

        def slurper = new JsonSlurper()
        def parsedJson = slurper.parseText(response.data.toString())
        def dependency = parsedJson.response.docs.id
        fileWriter.write("compile '${dependency}'")
        fileWriter.write('\n')
        sleep(new Random().nextInt(20))
        return parsedJson.response.docs.id
    }
} finally {
    fileWriter.close()
}
println parsedGradleParameters
Groovy pros - pardon if the code is not clean. :)

Scala: How do I dynamically instantiate an object and invoke a method using reflection?

In Scala, what's the best way to dynamically instantiate an object and invoke a method using reflection?
I would like to do Scala-equivalent of the following Java code:
Class<?> clazz = Class.forName("Foo");
Object foo = clazz.newInstance();
Method method = clazz.getMethod("hello");
method.invoke(foo);
In the above code, both the class name and the method name are passed in dynamically. The above Java mechanism could probably be used for Foo and hello(), but the Scala types don't match one-to-one with Java's. For example, a class may be declared implicitly for a singleton object, and Scala methods allow all sorts of symbols in their names. Both are resolved by name mangling. See Interop Between Java and Scala.
Another issue seems to be the matching of parameters by resolving overloads and autoboxing, described in Reflection from Scala - Heaven and Hell.
There is an easier way to invoke method reflectively without resorting to calling Java reflection methods: use Structural Typing.
Just cast the object reference to a Structural Type which has the necessary method signature then call the method: no reflection necessary (of course, Scala is doing reflection underneath but we don't need to do it).
class Foo {
  def hello(name: String): String = "Hello there, %s".format(name)
}

object FooMain {
  def main(args: Array[String]) {
    val foo = Class.forName("Foo").newInstance.asInstanceOf[{ def hello(name: String): String }]
    println(foo.hello("Walter")) // prints "Hello there, Walter"
  }
}
The answers by VonC and Walter Chang are quite good, so I'll just complement with one Scala 2.8 Experimental feature. In fact, I won't even bother to dress it up, I'll just copy the scaladoc.
object Invocation extends AnyRef

A more convenient syntax for reflective invocation. Example usage:

class Obj { private def foo(x: Int, y: String): Long = x + y.length }

You can call it reflectively in one of two ways:

import scala.reflect.Invocation._
(new Obj) o 'foo(5, "abc")                // the 'o' method returns Any
val x: Long = (new Obj) oo 'foo(5, "abc") // the 'oo' method casts to the expected type

If you call the oo method and do not give the type inferencer enough help, it will most likely infer Nothing, which will result in a ClassCastException.
Author: Paul Phillips
The instantiation part could use a Manifest: see this SO answer about the experimental Scala feature called manifests, which are a way to get around a Java constraint regarding type erasure:
class Test[T](implicit m: Manifest[T]) {
  val testVal = m.erasure.newInstance().asInstanceOf[T]
}
With this version you still write
class Foo
val t = new Test[Foo]
However, if there's no no-arg constructor available you get a runtime exception instead of a static type error
scala> new Test[Set[String]]
java.lang.InstantiationException: scala.collection.immutable.Set
at java.lang.Class.newInstance0(Class.java:340)
So the truly type-safe solution would be to use a factory.
Note: as stated in this thread, Manifest is here to stay, but is for now "only use is to give access to the erasure of the type as a Class instance."
The only thing manifests give you now is the erasure of the static type of a parameter at the call site (contrary to getClass which give you the erasure of the dynamic type).
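For completeness, a sketch of the same idea on Scala 2.10+, where ClassTag replaces the deprecated Manifest and runtimeClass plays the role of m.erasure; the same caveat about missing no-arg constructors applies:

```scala
import scala.reflect.ClassTag

class Test[T](implicit ct: ClassTag[T]) {
  // still throws at runtime if T lacks an accessible no-arg constructor
  val testVal: T = ct.runtimeClass.getDeclaredConstructor().newInstance().asInstanceOf[T]
}

class Foo
val t = new Test[Foo] // t.testVal holds a fresh Foo instance
```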
You can then get a method through reflection:
classOf[ClassName].getMethod("main", classOf[Array[String]])
and invoke it
scala> class A {
     |   def foo_=(foo: Boolean) = "bar"
     | }
defined class A

scala> val a = new A
a: A = A@1f854bd

scala> a.getClass.getMethod(decode("foo_="), classOf[Boolean]).invoke(a, java.lang.Boolean.TRUE)
res15: java.lang.Object = bar
In case you need to invoke a method of a Scala 2.10 object (not class) and you have the names of the method and object as Strings, you can do it like this:
package com.example.mytest

import scala.reflect.runtime.universe

class MyTest

object MyTest {
  def target(i: Int) = println(i)

  def invoker(objectName: String, methodName: String, arg: Any) = {
    val runtimeMirror = universe.runtimeMirror(getClass.getClassLoader)
    val moduleSymbol = runtimeMirror.moduleSymbol(Class.forName(objectName))
    val targetMethod = moduleSymbol.typeSignature
      .members
      .filter(x => x.isMethod && x.name.toString == methodName)
      .head
      .asMethod
    runtimeMirror.reflect(runtimeMirror.reflectModule(moduleSymbol).instance)
      .reflectMethod(targetMethod)(arg)
  }

  def main(args: Array[String]): Unit = {
    invoker("com.example.mytest.MyTest$", "target", 5)
  }
}
This prints 5 to standard output.
Further details in Scala Documentation.
Working up from @nedim's answer, here is a basis for a full answer. The main difference is that below we instantiate plain classes. This code does not handle the case of multiple constructors, and is by no means a full answer.
import scala.reflect.runtime.universe

case class Case(foo: Int) {
  println("Case Class Instantiated")
}

class Class {
  println("Class Instantiated")
}

object Inst {
  def apply(className: String, arg: Any) = {
    val runtimeMirror: universe.Mirror = universe.runtimeMirror(getClass.getClassLoader)
    val classSymbol: universe.ClassSymbol = runtimeMirror.classSymbol(Class.forName(className))
    val classMirror: universe.ClassMirror = runtimeMirror.reflectClass(classSymbol)
    if (classSymbol.companion.toString() == "<none>") { // TODO: use nicer method "hiding" in the api?
      println(s"Info: $className has no companion object")
      val constructors = classSymbol.typeSignature.members.filter(_.isConstructor).toList
      if (constructors.length > 1) {
        println(s"Info: $className has several constructors")
      } else {
        val constructorMirror = classMirror.reflectConstructor(constructors.head.asMethod) // we can reuse it
        constructorMirror()
      }
    } else {
      val companionSymbol = classSymbol.companion
      println(s"Info: $className has companion object $companionSymbol")
      // TBD
    }
  }
}

object app extends App {
  val c = Inst("Class", "")
  val cc = Inst("Case", "")
}
Here is a build.sbt that would compile it:
lazy val reflection = (project in file("."))
  .settings(
    scalaVersion := "2.11.7",
    libraryDependencies ++= Seq(
      "org.scala-lang" % "scala-compiler" % scalaVersion.value % "provided",
      "org.scala-lang" % "scala-library" % scalaVersion.value % "provided"
    )
  )
