How to invoke npm via sbt in a multi-module project - sbt

I am using sbt to build a multi-module project. One of the modules is a JavaScript application which builds with npm. I would like to simply execute npm via a shell script as part of the package task, and use the resulting output file as the artifact for that module. I am able to run a shell command as part of the package task, but for some reason this task is ignored when I run publish or publishLocal.
Attached is my Build.scala.
It is the accounts-ui project which should build using npm. Right now the actual npm build is represented by a simple script.
import sbt._
import Keys._
import play.Play.autoImport._
import PlayKeys._
import play.PlayScala
import sbtassembly._
import AssemblyKeys._
import net.virtualvoid.sbt.graph.Plugin.graphSettings
import ohnosequences.sbt.SbtS3Resolver.autoImport._

object Build extends Build {

  lazy val commonSettings = Seq(
    organization := "myorg",
    scalaVersion := "2.11.5"
  )

  lazy val publishSettings = Seq(
    publishTo := {
      val prefix = if (isSnapshot.value) "snapshots" else "releases"
      Some(s3resolver.value("MyOrg " + prefix + " S3 bucket", s3(prefix + ".repo.myorg.com")))
    }
  )

  lazy val root = Project(id = "root", base = file(".")).settings(commonSettings).settings(
    name := "accounts-root"
  ).settings(publishSettings).aggregate(api, ui)

  val _apiName = "accounts-api"

  lazy val api = Project(id = "api", base = file("./api")).settings(commonSettings).settings(
    name := "accounts-api",
    libraryDependencies ++= Seq(
      specs2
    )
  ).settings(publishSettings).settings(graphSettings).settings(
    mainClass in assembly := Some("play.core.server.NettyServer"),
    fullClasspath in assembly += Attributed.blank(PlayKeys.playPackageAssets.value),
    test in assembly := {},
    assemblyExcludedJars in assembly := {
      val cp = (fullClasspath in assembly).value
      cp filter { el =>
        val name = el.data.getName
        name.contains("mockito") || name.contains("commons-logging") || name.contains("specs2")
      }
    }
  ).settings(addArtifact(Artifact(_apiName, "assembly"), assembly)
  ).enablePlugins(PlayScala)

  val npmBuildTask = taskKey[Unit]("some custom task")

  lazy val ui = Project(id = "ui", base = file("./ui")).settings(commonSettings).settings(
    name := "accounts-ui",
    npmBuildTask := {
      val processBuilder = Process("npm-build.sh")
      val process = processBuilder.run()
      if (process.exitValue() != 0)
        throw new Error(s"custom task failed with exit value ${process.exitValue()}")
    },
    Keys.`package` <<= (Keys.`package` in Compile) dependsOn npmBuildTask
  ).settings(publishSettings)
}

I was able to solve it as follows:
val npmPackageTask = taskKey[File]("npm package task")

lazy val ui = Project(id = "ui", base = file("./ui")).settings(commonSettings).settings(
  name := "accounts-ui",
  npmPackageTask := {
    val processBuilder = Process("npm-build.sh")
    val process = processBuilder.run()
    if (process.exitValue() != 0)
      throw new Error(s"custom task failed with exit value ${process.exitValue()}")
    file(".")
  },
  packageBin in Compile <<= npmPackageTask
).settings(publishSettings)
The key was to declare the task key as taskKey[File], target the packageBin key, and replace (rather than extend) the task with the <<= operator.
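On newer sbt versions (0.13.13+ and 1.x), where <<= is deprecated, the same replacement can be written with :=. This is a sketch under the same assumptions as above (an npm-build.sh script in the module directory); the dist/accounts-ui.zip output path is hypothetical — return whatever file your npm build actually produces:

```scala
val npmPackageTask = taskKey[File]("npm package task")

lazy val ui = (project in file("ui")).settings(
  name := "accounts-ui",
  npmPackageTask := {
    // Run the npm build script from the module's base directory.
    val exitCode = Process("npm-build.sh", baseDirectory.value).!
    if (exitCode != 0)
      sys.error(s"npm build failed with exit value $exitCode")
    // Hypothetical output location; return the artifact your script writes.
    baseDirectory.value / "dist" / "accounts-ui.zip"
  },
  // := replaces the default packaging with the npm output, as <<= did.
  packageBin in Compile := npmPackageTask.value
)
```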

Related

fastapi's class as dependency override in unit testing

Using the parameter supplied in the request body, I wanted to instantiate an object. For instance, if the request body contains type=push I desired to instantiate in this manner.
if channel_type.upper() == "PUSH":
    return NotificationService(Push())
I am able to achieve this through classes as dependency of fastapi as following:
from typing import Any, Dict

from fastapi import Body, Depends
from pydantic import BaseModel

# NotificationService, Push, Email, Sms, Message, get_users and router
# come from the application code.

def factory(channel_type):
    if channel_type.upper() == "PUSH":
        return NotificationService(Push())
    elif channel_type.upper() == "EMAIL":
        return NotificationService(Email())
    else:
        return NotificationService(Sms())

class NotificationRequest:
    def __init__(self, type: str = Body("push")):
        self.service = factory(type)

class MessageRequest(BaseModel):
    title: str
    message: str
    payload: Dict = {}

@router.post("/", response_model=dict)
def notify(message: MessageRequest, request: NotificationRequest = Depends()) -> Any:
    request.service.send(users=get_users(), message=Message(
        title=message.title, message=message.message, payload=message.payload
    ))
    return {"success": True}
However, I wanted to override the NotificationRequest dependency in the unit test so that I could provide fake instances of real classes:
def get_factory(channel_type):
    if channel_type.upper() == "PUSH":
        return NotificationService(FakePush())
    elif channel_type.upper() == "EMAIL":
        return NotificationService(FakeEmail())
    else:
        return NotificationService(FakeSms())
I have tried the following:
    app.dependency_overrides[factory] = lambda: get_factory
which fails because factory is never injected through the dependency machinery.
How can we give a parameter to the body and inject a dependency into the function to produce a fake instance for unit testing?
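One likely reason the override has no effect: FastAPI's dependency_overrides dict is keyed by the callable that appears in Depends(...), and factory is called directly inside NotificationRequest.__init__, so it never passes through the resolver. The plain-Python sketch below illustrates that mechanism with stub classes; solve_dependency is a hypothetical stand-in for FastAPI's resolver, not its real API:

```python
# Stub services standing in for the real application classes.
class NotificationService:
    def __init__(self, channel):
        self.channel = channel

class Push: ...
class FakePush: ...

def factory(channel_type):
    # Called directly by NotificationRequest.__init__ -- never injected.
    return NotificationService(Push())

class NotificationRequest:
    def __init__(self, type="push"):
        self.service = factory(type)

# FastAPI keys overrides by the dependency callable itself.
dependency_overrides = {}

def solve_dependency(dep, **kwargs):
    """Hypothetical sketch of the resolver: only callables that pass
    through here can be swapped via dependency_overrides."""
    return dependency_overrides.get(dep, dep)(**kwargs)

# Overriding `factory` changes nothing: __init__ calls it directly.
dependency_overrides[factory] = lambda channel_type: NotificationService(FakePush())
real = solve_dependency(NotificationRequest, type="push")
assert isinstance(real.service.channel, Push)

# Overriding the class dependency itself does work, because
# NotificationRequest is what appears in Depends().
class FakeNotificationRequest:
    def __init__(self, type="push"):
        self.service = NotificationService(FakePush())

dependency_overrides[NotificationRequest] = FakeNotificationRequest
fake = solve_dependency(NotificationRequest, type="push")
assert isinstance(fake.service.channel, FakePush)
```

This suggests overriding NotificationRequest (the class named in Depends()) rather than factory in the unit test.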

Golang web app localization

I have a web app written in Go, and I am planning to make it available in more than one language. I've taken a look at the available i18n packages, but some things were not clear to me.
What packages would be ideal to determine the users locale and load the site accordingly? Like from browser preferences or location?
You can use https://github.com/nicksnyder/go-i18n/
Then in your project you have to create a folder called i18n/ and use a function like this:
import (
    "fmt"
    "io/ioutil"
    "log"

    "github.com/nicksnyder/go-i18n/i18n"
)

const defaultLang = "en-US"

func loadI18nFiles() {
    files, err := ioutil.ReadDir("i18n")
    if err != nil {
        log.Fatalf("i18n: cannot read i18n directory: %s", err)
    }
    exists := false
    for _, file := range files {
        if err := i18n.LoadTranslationFile(fmt.Sprintf("i18n/%s", file.Name())); err != nil {
            log.Printf("i18n: error loading file %s. err: %s", file.Name(), err)
        } else {
            log.Printf("i18n: lang file %s loaded", file.Name())
        }
        // Check whether the default language file is present.
        if file.Name() == fmt.Sprintf("%s.json", defaultLang) {
            exists = true
        }
    }
    if !exists {
        panic(fmt.Sprintf("default language (%s) does not exist in the i18n folder", defaultLang))
    }
}
Then, to use it, import the package and call the translation function:
    T, _ := i18n.Tfunc("es-AR", "en-US")
    fmt.Println(T("key"))
Each file inside the i18n folder is a .json file. For example:
en-US.json
[
  {
    "id": "key",
    "translation": "Hello World"
  }
]

es-AR.json

[
  {
    "id": "key",
    "translation": "Hola Mundo"
  }
]

sbt mappings in Universal error after sbt upgrade to 0.13.13

mappings in Universal <++= (packageBin in Compile, sourceDirectory) map { (_, src) =>
  val confFiles = (src / "main" / "resources") ** "*.conf"
  confFiles.get.map(file => file -> ("conf/" + file.name))
},
This works, but generates a compiler warning: <<= has been deprecated. Changing the operator to ++= generates a compiler error:
    error: No implicit for Append.Values[Seq[(java.io.File, String)], sbt.Def.Initialize[sbt.Task[Seq[(java.io.File, String)]]]] found,
      so sbt.Def.Initialize[sbt.Task[Seq[(java.io.File, String)]]] cannot be appended to Seq[(java.io.File, String)]
    mappings in Universal ++= (packageBin in Compile, sourceDirectory) map { (_, src) =>
Here is how I solved it:
mappings in Universal ++= { (packageBin in Compile, sourceDirectory) map { (_, src) =>
    val confFiles = (src / "main" / "resources") ** "*.conf"
    confFiles.get.map(file => file -> ("conf/" + file.name))
  }
}.value,
Even better is:
mappings in Universal ++= {
  val src = sourceDirectory.value
  val confFiles = (src / "main" / "resources") ** "*.conf"
  confFiles.get.map(file => file -> ("conf/" + file.name))
}
This operator is very confusing. Try a simpler := which is functionally equivalent:
mappings.in(Universal) := {
  // Dependency on packageBin (part of your previous definition).
  packageBin.in(Compile).value
  // Create the new mappings.
  val confFiles = (sourceDirectory.value / "main" / "resources") ** "*.conf"
  val newMappings = confFiles.get.map(file => file -> ("conf/" + file.name))
  // Append them manually to the previous value.
  mappings.in(Universal).value ++ newMappings
}

How to extract common code from gradle jar task to a method

I have two tasks, nativeJar and native64Jar; the manifest and doLast closures are the same for both tasks except for the file names. Is it possible to extract that code into a common method, pass the two file names as method parameters, and call that common method from both tasks, or from the doLast closure?
task nativeJar( type: Jar ) {
    doFirst {
        delete fileTree(dir: "$releaseDir", include: "*.jar")
    }
    baseName = 'NativeLibs'
    destinationDir = new File(releaseDir)
    from files(releaseDir + 'jar_merge/signedNativeLibs')
    manifest {
        attributes 'Permissions' : 'all-permissions', 'Publisher' : 'abc', 'Application-Name' : 'WorkBench', 'Codebase' : '*.abc.com'
    }
    doLast {
        ant.signjar( jar: "$releaseDir/NativeLibs.jar", alias: "WorkBench", keystore: "WorkBench.jks", signedjar: "$releaseDir/signedNativeLibs.jar", storepass: "freddie" )
    }
}

// Create signedNativeLibs64.jar file
task native64Jar( type: Jar, dependsOn: 'nativeJar' ) {
    baseName = 'NativeLibs64'
    destinationDir = new File(releaseDir)
    from files(releaseDir + 'jar_merge/signedNativeLibs64')
    manifest {
        attributes 'Permissions' : 'all-permissions', 'Publisher' : 'abc', 'Application-Name' : 'WorkBench', 'Codebase' : '*.abc.com'
    }
    doLast {
        ant.signjar( jar: "$releaseDir/NativeLibs64.jar", alias: "WorkBench", keystore: "WorkBench.jks", signedjar: "$releaseDir/signedNativeLibs64.jar", storepass: "freddie" )
    }
}
I would recommend splitting out the signing as a separate task so that you get proper up-to-date checks from Gradle. As you have it now, you'll always sign the jar every time you build. And if you delete the signed jar, it won't generate again until you clean the native jar too.
You can share configuration closures between tasks. E.g.,
[ task1, task2 ].each { task ->
    task.configure {
        // shared closure
    }
}
There are a few other best practices I'd follow.
Don't use new File() since it makes your script dependent on the current working directory.
Refer to outputs via the task versus recreating the full path (e.g., what you're doing with $releaseDir/NativeLibs.jar). Gradle is able to infer dependencies that way.
Use a custom task class vs an ad-hoc task with doFirst()/doLast(). Since you're delegating all the work to the ant task, this should be really simple.
I'm not sure why you need your particular file names, but I left them as-is. If they're not important, removing them would make this even simpler.
I took a stab at your example (disclaimer: I didn't try it):
task nativeJar( type: Jar ) {
    baseName = 'NativeLibs'
    from files(releaseDir + 'jar_merge/signedNativeLibs')
}

task native64Jar( type: Jar ) {
    baseName = 'NativeLibs64'
    from files(releaseDir + 'jar_merge/signedNativeLibs64')
}

[ nativeJar, native64Jar ].each { task ->
    task.configure {
        destinationDir = file(releaseDir)
        manifest {
            attributes 'Permissions' : 'all-permissions', 'Publisher' : 'Financial Engineering', 'Application-Name' : 'WorkBench', 'Codebase' : '*.fhlmc.com'
        }
    }
}
// This class definition should go at the top of your build.gradle script, else it will throw the exception mentioned in the comments.
class SignJarTask extends DefaultTask {
    @InputFile File inputFile
    @OutputFile File outputFile

    @TaskAction
    void signJar() {
        ant.signjar( jar: inputFile, alias: "WorkBench", keystore: "WorkBench.jks", signedjar: outputFile, storepass: "freddie" )
    }
}
task signJar(type: SignJarTask) {
    inputFile = file("$releaseDir/NativeLibs.jar")
    outputFile = file("$releaseDir/signedNativeLibs.jar")
}

task sign64Jar(type: SignJarTask) {
    inputFile = file("$releaseDir/NativeLibs64.jar")
    outputFile = file("$releaseDir/signedNativeLibs64.jar")
}

How to get the scopes of the running task?

I want to get the project and configuration axis from within the task. For example, considering the following task:
myTask := {
  val project = ???       // how do I get the current project?
  val configuration = ??? // how do I get the current configuration?
  val key = ???           // how do I get the current key?
  println(s"project: $project")
  println(s"configuration: $configuration")
  println(s"key: $key")
}
If I run the task like this,
> myModule/myConfig:myTask
it should print
project: myModule
configuration: myConfig
key: myTask
My partial solution in build.sbt would be as follows:
lazy val myTask = taskKey[Unit]("Prints axes of its execution")

lazy val myTaskSetting = myTask := {
  val project = thisProject.value.id
  val cfg = configuration.?.value
  val key = myTask.key.label
  println(s"project: $project")
  println(s"configuration: $cfg")
  println(s"key: $key")
}

myTaskSetting

lazy val a, b = project settings (Seq(myTaskSetting): _*)
I've no idea how to access the current configuration the task is bound to upon execution.
> myTask
project: axes
configuration: None
key: myTask
project: b
configuration: None
key: myTask
project: a
configuration: None
key: myTask
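One approach that may help: sbt exposes a resolvedScoped key holding the fully resolved ScopedKey of the definition being evaluated, from which the project, configuration, and key labels can be read. A sketch, not verified against the exact sbt version above; note it reports the scope the definition is inserted into, so the configuration axis is only populated when the setting is added in a configuration scope:

```scala
lazy val myTask = taskKey[Unit]("Prints axes of its execution")

lazy val myTaskSetting = myTask := {
  val scoped = resolvedScoped.value  // ScopedKey of this definition
  val project = scoped.scope.project.toOption.collect {
    case ProjectRef(_, id) => id
  }
  val config = scoped.scope.config.toOption.map(_.name)
  println(s"project: $project")
  println(s"configuration: $config")
  println(s"key: ${scoped.key.label}")
}
```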
