I have two tasks, nativeJar and native64Jar. The manifest and doLast closures are the same for both tasks except for the file names. Is it possible to extract that code into a common method, pass the two file names as method parameters, and call that common method from both tasks (or from the doLast closure)?
task nativeJar( type: Jar ) {
    doFirst {
        delete fileTree(dir: "$releaseDir", include: "*.jar")
    }
    baseName = 'NativeLibs'
    destinationDir = new File(releaseDir)
    from files(releaseDir + 'jar_merge/signedNativeLibs')
    manifest {
        attributes 'Permissions' : 'all-permissions', 'Publisher' : 'abc', 'Application-Name' : 'WorkBench', 'Codebase' : '*.abc.com'
    }
    doLast {
        ant.signjar( jar: "$releaseDir/NativeLibs.jar", alias:"WorkBench", keystore: "WorkBench.jks", signedjar: "$releaseDir/signedNativeLibs.jar", storepass: "freddie" )
    }
}
// Create signedNativeLibs64.jar file
task native64Jar( type: Jar , dependsOn: 'nativeJar' ) {
    baseName = 'NativeLibs64'
    destinationDir = new File(releaseDir)
    from files(releaseDir + 'jar_merge/signedNativeLibs64')
    manifest {
        attributes 'Permissions' : 'all-permissions', 'Publisher' : 'abc', 'Application-Name' : 'WorkBench', 'Codebase' : '*.abc.com'
    }
    doLast {
        ant.signjar( jar: "$releaseDir/NativeLibs64.jar", alias:"WorkBench", keystore: "WorkBench.jks", signedjar: "$releaseDir/signedNativeLibs64.jar", storepass: "freddie" )
    }
}
I would recommend splitting out the signing as a separate task so that you get proper up-to-date checks from Gradle. As you have it now, you'll always sign the jar every time you build. And if you delete the signed jar, it won't generate again until you clean the native jar too.
You can share configuration closures between tasks. E.g.,
[ task1, task2 ].each { task ->
    task.configure {
        // shared closure
    }
}
There are a few other best practices I'd follow.
Don't use new File() since it makes your script dependent on the current working directory.
Refer to outputs via the task versus recreating the full path (e.g., what you're doing with $releaseDir/NativeLibs.jar). Gradle is able to infer dependencies that way.
Use a custom task class vs an ad-hoc task with doFirst()/doLast(). Since you're delegating all the work to the ant task, this should be really simple.
I'm not sure why you need your particular file names, but I left them as-is. If they're not important, removing them would make this even simpler.
I took a stab at your example (disclaimer: I didn't try it):
task nativeJar( type: Jar ) {
    baseName = 'NativeLibs'
    from files(releaseDir + 'jar_merge/signedNativeLibs')
}
task native64Jar( type: Jar ) {
    baseName = 'NativeLibs64'
    from files(releaseDir + 'jar_merge/signedNativeLibs64')
}
[ nativeJar, native64Jar ].each { task ->
    task.configure {
        destinationDir = file(releaseDir)
        manifest {
            attributes 'Permissions' : 'all-permissions', 'Publisher' : 'abc', 'Application-Name' : 'WorkBench', 'Codebase' : '*.abc.com'
        }
    }
}
// This class definition should go at the top of your build.gradle script, otherwise it will throw the exception mentioned in the comments
class SignJarTask extends DefaultTask {
    @InputFile File inputFile
    @OutputFile File outputFile

    @TaskAction
    void signJar() {
        ant.signjar( jar: inputFile, alias:"WorkBench", keystore: "WorkBench.jks", signedjar: outputFile, storepass: "freddie" )
    }
}
task signJar(type: SignJarTask) {
    inputFile = file("$releaseDir/NativeLibs.jar")
    outputFile = file("$releaseDir/signedNativeLibs.jar")
}
task sign64Jar(type: SignJarTask) {
    inputFile = file("$releaseDir/NativeLibs64.jar")
    outputFile = file("$releaseDir/signedNativeLibs64.jar")
}
I searched Stack Overflow and looked at the Grunt API docs, but couldn't find a way to run a parametrized task using grunt.task.run(taskname).
I have a simple task which accepts a parameter and prints a message on the console:
grunt.registerTask('hello', 'greeting task', function(name) {
  if(!name || !name.length)
    grunt.warn('you need to provide a name');
  console.log('hello ' + name + '!');
});
I call the above task from the task below, which checks whether the given task exists and, if so, runs it:
grunt.registerTask('validateandruntask', 'if task available then run given task', function(taskname) {
  if(!taskname || !taskname.length) {
    grunt.warn('task name is needed to run this task');
  }
  if(!grunt.task.exists(taskname)) {
    grunt.log.writeln('this task does not exist!');
  } else {
    grunt.log.writeln(taskname + ' exists. Going to run this task');
    grunt.task.run(taskname);
  }
});
Now, from the command line, I am passing the 'hello' task as a parameter to 'validateandruntask', but I have not been able to pass a parameter on to the 'hello' task.
This is what I tried on the command line, but it didn't work:
grunt validateandruntask:hello=foo
grunt validateandruntask:hello:param=name
First, the way to pass an argument on the command line is to use :.
For example, to call hello directly:
grunt hello:you
To call it with multiple arguments, just separate them by :, like
grunt hello:mister:president
To use these multiple arguments in the task, you do the same as in plain JavaScript: use arguments (all details here):
grunt.registerTask('hello', 'greeting task', function(name) {
  if(!name || !name.length)
    grunt.warn('you need to provide a name');
  // unfortunately arguments is not an array,
  // we need to convert it to use array methods like join()
  var args = Array.prototype.slice.call(arguments);
  var greet = 'hello ' + args.join(' ') + '!';
  console.log(greet);
});
Then you want to call grunt validateandruntask:hello:mister:president, and modify your code to handle the variable parameters as well:
grunt.registerTask('validateandruntask', 'if task available then run given task', function(taskname) {
  if(!taskname || !taskname.length) {
    grunt.fail.fatal('task name is needed to run this task');
  }
  var taskToCall = taskname;
  for(var i = 1; i < arguments.length; i++) {
    taskToCall += ':' + arguments[i];
  }
  console.log(taskToCall);
  if(!grunt.task.exists(taskname)) {
    grunt.log.writeln('this task does not exist!');
  } else {
    grunt.log.writeln(taskname + ' exists. Going to run this task');
    grunt.task.run(taskToCall);
  }
});
Is there a built-in function to copy a directory and recursively copy all the files (and other directories) in Dart?
Not to my knowledge, no. But Dart supports basic reading and writing of files and directories, so this can be solved programmatically.
Check out this gist I found of a tool that would accomplish this process.
Basically, you would search the directory for the files you want to copy and perform the copy operation
newFile.writeAsBytesSync(element.readAsBytesSync());
for each file path (new Path(element.path);) into the new Directory(newLocation);.
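To make that concrete, here is a rough sketch of that approach (copyDirectorySync is a hypothetical helper, not a library API, and error handling is omitted):
import 'dart:io';

// Hypothetical helper: walk the source directory, recreate its structure,
// and write each file's bytes into the destination.
void copyDirectorySync(Directory source, Directory destination) {
  destination.createSync(recursive: true);
  for (final entity in source.listSync(followLinks: false)) {
    final name = entity.path.split(Platform.pathSeparator).last;
    final newPath = '${destination.path}${Platform.pathSeparator}$name';
    if (entity is Directory) {
      copyDirectorySync(entity, Directory(newPath));
    } else if (entity is File) {
      File(newPath).writeAsBytesSync(entity.readAsBytesSync());
    }
  }
}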
Edit:
But this is super inefficient, because the whole file has to be read in by the system and written back out to a new file. You could just use a shell process spawned from Dart to take care of the copy for you:
Process.run("cmd", ["/c", "copy", ...])
Found https://pub.dev/documentation/io/latest/io/copyPath.html (or the sync version of same) which seems to be working for me. It's part of the io package https://pub.dev/documentation/io/latest/io/io-library.html available at https://pub.dev/packages/io.
It does the equivalent of cp -R <from> <to>.
Thank you James, I wrote a quick function for it, but did it an alternative way. I'm unsure whether this way is any more efficient or not.
/**
 * Retrieve all files within a directory
 */
Future<List<File>> allDirectoryFiles(String directory)
{
  List<File> frameworkFilePaths = [];

  // Grab all paths in directory
  return new Directory(directory).list(recursive: true, followLinks: false)
      .listen((FileSystemEntity entity)
  {
    // For each path, if the path leads to a file, then add to array list
    File file = new File(entity.path);
    file.exists().then((exists)
    {
      if (exists)
      {
        frameworkFilePaths.add(file);
      }
    });
  }).asFuture().then((_) { return frameworkFilePaths; });
}
Edit: OR! An even better approach (in some situations) would be to return a stream of files in the directory:
/**
 * Directory file stream
 *
 * Retrieve all files within a directory as a file stream.
 */
Stream<File> _directoryFileStream(Directory directory)
{
  StreamController<File> controller;
  StreamSubscription source;

  controller = new StreamController<File>(
    onListen: ()
    {
      // Grab all paths in directory
      source = directory.list(recursive: true, followLinks: false).listen((FileSystemEntity entity)
      {
        // For each path, if the path leads to a file, then add the file to the stream
        File file = new File(entity.path);
        file.exists().then((bool exists)
        {
          if (exists)
            controller.add(file);
        });
      },
      onError: (error) => controller.addError(error),
      onDone: () => controller.close()
      );
    },
    onPause: () { if (source != null) source.pause(); },
    onResume: () { if (source != null) source.resume(); },
    onCancel: () { if (source != null) source.cancel(); }
  );

  return controller.stream;
}
I came across the same problem today.
It turns out the io package from pub.dev solves this with a clean API:
copyPath or copyPathSync
https://pub.dev/documentation/io/latest/io/copyPath.html
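A minimal usage sketch (the source and destination paths are just placeholders):
import 'package:io/io.dart';

// Recursively copies the contents of 'assets' into 'build/assets',
// creating the destination directory if needed.
Future<void> main() async {
  await copyPath('assets', 'build/assets');
}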
What would be the proper Gradle way of downloading and unzipping a file from a URL (HTTP)?
If possible, I'd like to prevent re-downloading each time I run the task (with ant.get this can be achieved by skipexisting: 'true').
My current solution would be:
task foo {
    ant.get(src: 'http://.../file.zip', dest: 'somedir', skipexisting: 'true')
    ant.unzip(src: 'somedir' + '/file.zip', dest: 'unpackdir')
}
Still, I'd prefer an Ant-free solution. Is there any way to achieve that?
Let's say you want to download this zip file as a dependency:
https://github.com/jmeter-gradle-plugin/jmeter-gradle-plugin/archive/1.0.3.zip
You define your ivy repo as:
repositories {
    ivy {
        url 'https://github.com/'
        patternLayout {
            artifact '/[organisation]/[module]/archive/[revision].[ext]'
        }
        // This is required in Gradle 6.0+ as metadata file (ivy.xml)
        // is mandatory. Docs linked below this code section
        metadataSources { artifact() }
    }
}
reference for required metadata here
The dependency can then be used as:
dependencies {
    compile 'jmeter-gradle-plugin:jmeter-gradle-plugin:1.0.3#zip'
    // This maps to the pattern: [organisation]:[module]:[revision]:[classifier]#[ext]
}
To unzip:
task unzip(type: Copy) {
    def zipPath = project.configurations.compile.find { it.name.startsWith("jmeter") }
    println zipPath
    def zipFile = file(zipPath)
    def outputDir = file("${buildDir}/unpacked/dist")

    from zipTree(zipFile)
    into outputDir
}
Optional:
If you have more than one repository in your project, it may also help (for build time and, to some extent, security) to restrict the dependency search to the relevant repositories.
Gradle 6.2+:
repositories {
    mavenCentral()
    def github = ivy {
        url 'https://github.com/'
        patternLayout {
            artifact '/[organisation]/[module]/archive/[revision].[ext]'
        }
        metadataSources { artifact() }
    }
    exclusiveContent {
        forRepositories(github)
        filter { includeGroup("jmeter-gradle-plugin") }
    }
}
Earlier Gradle versions:
repositories {
    mavenCentral {
        content { excludeGroup("jmeter-gradle-plugin") }
    }
    ivy {
        url 'https://github.com/'
        patternLayout {
            artifact '/[organisation]/[module]/archive/[revision].[ext]'
        }
        metadataSources { artifact() }
        content { includeGroup("jmeter-gradle-plugin") }
    }
}
plugins {
    id 'de.undercouch.download' version '4.0.0'
}

/**
 * The following two tasks download a ZIP file and extract its
 * contents to the build directory
 */
task downloadZipFile(type: Download) {
    src 'https://github.com/gradle-download-task/archive/1.0.zip'
    dest new File(buildDir, '1.0.zip')
}

task downloadAndUnzipFile(dependsOn: downloadZipFile, type: Copy) {
    from zipTree(downloadZipFile.dest)
    into buildDir
}
https://github.com/michel-kraemer/gradle-download-task
There isn't currently a Gradle API for downloading from a URL. You can implement this using Ant, Groovy, or, if you do want to benefit from Gradle's dependency resolution/caching features, by pretending it's an Ivy repository with a custom artifact URL. The unzipping can be done in the usual Gradle way (copy method or Copy task).
Unzipping using the copy task works like this:
task unzip(type: Copy) {
    def zipFile = file('src/dists/dist.zip')
    def outputDir = file("${buildDir}/unpacked/dist")

    from zipTree(zipFile)
    into outputDir
}
http://mrhaki.blogspot.de/2012/06/gradle-goodness-unpacking-archive.html
This works with Gradle 5 (tested with 5.5.1):
task download {
    doLast {
        def f = new File('file_path')
        new URL('url').withInputStream { i -> f.withOutputStream { it << i } }
    }
}
Calling gradle download downloads the file from url to file_path.
You can use the other methods from other answers to unzip the file if necessary.
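If you also need the unzip step, a sketch combining this with the Copy/zipTree approach from the other answers could look like the following (file_path and the unpack directory are placeholders):
task unzipDownload(type: Copy, dependsOn: 'download') {
    from zipTree('file_path')
    into "${buildDir}/unpacked"
}
Since zipTree is evaluated lazily, the archive only needs to exist by the time unzipDownload actually runs.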
I got @RaGe's answer working, but I had to adapt it since the Ivy layout method has been deprecated; see
https://docs.gradle.org/current/dsl/org.gradle.api.artifacts.repositories.IvyArtifactRepository.html#org.gradle.api.artifacts.repositories.IvyArtifactRepository:layout(java.lang.String,%20groovy.lang.Closure)
So to get it working I had to adjust it to this for a Tomcat Keycloak adapter:
ivy {
    url 'https://downloads.jboss.org/'
    patternLayout {
        artifact '/[organization]/[revision]/adapters/keycloak-oidc/[module]-[revision].[ext]'
    }
}

dependencies {
    // https://downloads.jboss.org/keycloak/4.8.3.Final/adapters/keycloak-oidc/keycloak-tomcat8-adapter-dist-4.8.3.Final.zip
    compile "keycloak:keycloak-tomcat8-adapter-dist:$project.ext.keycloakAdapterVersion#zip"
}

task unzipKeycloak(type: Copy) {
    def zipPath = project.configurations.compile.find { it.name.startsWith("keycloak") }
    println zipPath
    def zipFile = file(zipPath)
    def outputDir = file("${buildDir}/tomcat/lib")

    from zipTree(zipFile)
    into outputDir
}
"native gradle", only the download part (see other answers for unzipping)
task foo {
    def src = 'http://example.com/file.zip'
    def destdir = 'somedir'
    def destfile = "$destdir/file.zip"

    doLast {
        def url = new URL(src)
        def f = new File(destfile)
        if (f.exists()) {
            println "file $destfile already exists, skipping download"
        } else {
            mkdir "$destdir"
            println "Downloading $destfile from $url..."
            url.withInputStream { i -> f.withOutputStream { it << i } }
        }
    }
}
I have the following Grunt tasks (simplified):
rev: {
  files: {
    src: ['dist/**/*.{js,css}']
  }
},
processhtml: {
  dev: {
    options: {
      data: {
        appJs: grunt.file.expand('dist/**/*.js')
      }
    },
    files: {
      'dist/index.html': 'app/index.html'
    }
  }
}
The grunt-rev task is run first, which takes regular JS and prepends a hash code to the filename. Then the grunt-processhtml task is run, which for this case I want to get all JS filenames generated by grunt-rev, and pass them as custom data.
The issue with this code is that grunt.file.expand is evaluated eagerly when the Gruntfile is first loaded, not when the processhtml task is run, so I get a different list of files from grunt.file.expand than I would expect, as it doesn't take the result of the grunt-rev task into account.
Is there a way to force lazy evaluation of a value when a task is actually run?
I would define a custom task that would (when called) set options for the processhtml task and run it.
Something along the lines of:
grunt.task.registerTask('foo', 'My foo task.', function() {
  grunt.config("processhtml.dev", {
    options: {
      data: {
        appJs: grunt.file.expand('dist/**/*.js')
      }
    }
  });
  grunt.task.run("processhtml:dev");
});
Is there a way to pass an argument from an alias task like this into one of the tasks it calls:
grunt.registerTask('taskA', ['taskB', 'taskC'])
grunt taskA:test
so that taskB and taskC will be called with the parameter test?
You can create a dynamic alias task like this:
grunt.registerTask('taskA', function(target) {
  var tasks = ['taskB', 'taskC'];
  if (target == null) {
    grunt.warn('taskA target must be specified, like taskA:001.');
  }
  grunt.task.run.apply(grunt.task, tasks.map(function(task) {
    return task + ':' + target;
  }));
});
Here is the FAQ with another example in the Grunt docs: http://gruntjs.com/frequently-asked-questions#dynamic-alias-tasks