Is it possible to deploy stuff into Artifactory with phing? I couldn't find anything useful by searching since phing is not very popular...
From what I saw, phing supports taskdef for adding additional tasks. If that's true, go ahead and use Ivy. It works perfectly with Artifactory.
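If wiring Ivy's Ant tasks into phing turns out to be awkward (they are Java tasks, while phing taskdefs are PHP classes), one low-tech alternative is to deploy over Artifactory's plain HTTP PUT API from a phing target. A minimal sketch — the repository name, properties, and paths here are all placeholders, not anything from your build:

```xml
<!-- Hypothetical phing target: upload a build artifact to Artifactory
     via HTTP PUT using curl. Set artifactory.url, artifactory.user and
     artifactory.pass as properties for your environment. -->
<target name="deploy">
  <exec
    command="curl -u ${artifactory.user}:${artifactory.pass} -T dist/myapp-1.0.0.tgz ${artifactory.url}/libs-release-local/myapp/1.0.0/myapp-1.0.0.tgz"
    checkreturn="true"/>
</target>
```

`checkreturn="true"` makes the build fail if curl returns a non-zero exit code, so a failed upload doesn't pass silently.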
I am using CSSLint via the CLI. I have spent quite a bit of time following the steps found here and here to write custom CSS linting rules. They are tested and they work correctly; however, now that I have created the rules I am not sure how to actually add them to the linter (or install them via npm) so that they can be used via the CLI to lint my projects. I have scoured the documentation in the GitHub wiki and cannot seem to find an answer.
Keep in mind these rules are project specific and they are not meant to be submitted to the csslint github repo.
Figured it out. I used 'grunt release', published the result as an npm module, and then installed it. The cause of the issue was that I needed to uninstall the original version of CSSLint, as it was overriding the new one.
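The steps above, sketched as shell commands — the fork's package name is a placeholder, and this assumes you run `grunt release` from your modified csslint source tree:

```shell
# build a release of the forked CSSLint from its source tree
grunt release

# publish the fork under your own package name
npm publish

# remove the original csslint so it no longer shadows the fork
npm uninstall -g csslint

# install the fork globally so the CLI picks up the custom rules
npm install -g my-csslint-fork
```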
Is there any tutorial for the scripted plugin? Or maybe a new way to test plugins. I've found a tutorial that seems to be a bit old.
If I have to resort to Scripted, some questions that come to my mind:
Do I need to publish my plugin locally before running scripted?
Can I refer to the version located in version.sbt from my tests?
For the record, I'm also using the cross-build plugin, so if possible the tests would need to cover both sbt 0.12 and 0.13.
(Author of the linked testing sbt plugins post here.) There haven't been any major changes to scripted since I first wrote it, but I updated some of the details.
To test the plugin end-to-end, I think publishing locally makes sense.
See the updated post. You can pass version number as a property using scriptedLaunchOpts, and catch it with System.getProperty on the other side.
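Concretely, the pattern looks like this — the organization and plugin name are placeholders, but the property hand-off is the one described above:

```scala
// In the plugin's build.sbt: forward the plugin's own version
// to the forked sbt instance that runs the scripted tests.
scriptedLaunchOpts := {
  scriptedLaunchOpts.value ++
    Seq("-Xmx1024M", "-Dplugin.version=" + version.value)
}

// In src/sbt-test/<group>/<test-name>/project/plugins.sbt:
// read the property back and add the locally published plugin.
sys.props.get("plugin.version") match {
  case Some(v) => addSbtPlugin("com.example" % "my-plugin" % v)
  case None    => sys.error("The system property 'plugin.version' is not defined.")
}
```

This way the scripted build always tests exactly the version you just published locally, including the one from version.sbt.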
Eugene's answer is still relevant, but sbt plugin testing now has a proper documentation page on the official sbt documentation site:
http://www.scala-sbt.org/release/docs/Testing-sbt-plugins.html
We are looking to use Karaf, but their introduction/quick start (main Karaf website) has almost nothing to say about deploying apps to the container - I know, amazing yet true. Anyone know of a useful introduction for someone completely new to Karaf? Thanks.
I guess you looked in the wrong places then, because the user and developer documentation tells you that you deploy your artifacts either by dropping them into the deploy folder, by installing them with osgi:install url, by adding/installing features, and so forth. I really recommend RTFM; it's there. In case you still can't find what you're looking for, ask the users mailing list.
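For example, the deployment options mentioned above look roughly like this on a Karaf 2.x installation — the bundle coordinates and feature name are placeholders:

```shell
# 1) hot deploy: drop the jar into the deploy folder
cp my-bundle-1.0.0.jar $KARAF_HOME/deploy/

# 2) install from a URL in the Karaf console
#    (mvn: URLs resolve against your configured repositories; -s also starts it)
karaf> osgi:install -s mvn:com.example/my-bundle/1.0.0

# 3) install a feature, i.e. a named set of bundles
karaf> features:install my-feature
```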
At this site you'll find the Karaf Online Documentation
A Google search for the keywords java, osgi bundle, activator will turn up examples of how to code for Karaf. For example: http://www.javaworld.com/javaworld/jw-03-2008/jw-03-osgi1.html?page=2
Also, after learning the keywords (OSGi and bundle) I noticed NetBeans has a project type of "OSGi Bundle". How nice.
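The activator pattern those keywords lead to boils down to a class like this — a minimal sketch, with placeholder names, that needs the org.osgi.core jar on the compile classpath:

```java
import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;

// Entry point of an OSGi bundle. The container (e.g. Karaf) calls
// start/stop when the bundle is started or stopped. The class is
// referenced from the Bundle-Activator header in MANIFEST.MF.
public class MyActivator implements BundleActivator {
    public void start(BundleContext context) {
        System.out.println("my-bundle started");
    }

    public void stop(BundleContext context) {
        System.out.println("my-bundle stopped");
    }
}
```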
Here are some useful links:
How to use Karaf console: http://karaf.apache.org/manual/latest/users-guide/console.html
Christian Schneider has some useful tutorials as well: http://www.liquid-reality.de/display/liquid/Karaf+Tutorials
Jean-Baptiste Onofré's blog (search on Google)
Note that Karaf 3 has slightly different shell commands, so make sure that you are pointing at the right Karaf version, or learn how to translate Karaf 2.x commands to Karaf 3.
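A few common translations (Karaf 2.x on the left, Karaf 3.x on the right):

```shell
osgi:install      ->  bundle:install
osgi:list         ->  bundle:list
features:install  ->  feature:install
features:list     ->  feature:list
```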
There are five choices listed in the maven documentation on testing maven plugins:
maven-verifier
maven-invoker-plugin
shitty-maven-plugin
maven-it-plugin
maven-plugin-management-plugin
I've tried a few of these and had a number of problems:
maven-verifier appears to have only a limited set of verifications -- I need to be able to make arbitrary assertions
shitty-maven-plugin has a bug that prevents it working with maven 3
neither maven-plugin-management-plugin nor maven-it-plugin is stable, and neither seems to be under active development
Is anyone able to recommend any of these plugins? Can you provide some example configuration?
The best thing I can recommend is the maven-invoker-plugin, because it can handle many situations and produces a real Maven environment with everything you need to do integration testing of Maven plugins.
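Since the question asks for example configuration, here is a minimal maven-invoker-plugin sketch; the directory layout follows the plugin's common defaults, and the version number is just a plausible one for a Maven 3 build, so adjust to taste:

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-invoker-plugin</artifactId>
  <version>1.8</version>
  <configuration>
    <!-- each subdirectory of src/it is a complete sample Maven project -->
    <projectsDirectory>src/it</projectsDirectory>
    <cloneProjectsTo>${project.build.directory}/it</cloneProjectsTo>
    <!-- install the plugin under test into a private local repository first -->
    <localRepositoryPath>${project.build.directory}/local-repo</localRepositoryPath>
    <!-- arbitrary assertions go into a post-build script (e.g. verify.groovy)
         in each sample project, which addresses the maven-verifier limitation -->
    <postBuildHookScript>verify</postBuildHookScript>
  </configuration>
  <executions>
    <execution>
      <id>integration-test</id>
      <goals>
        <goal>install</goal>
        <goal>run</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```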
So, it is easy enough to handle external jars when using Hadoop straight up: the -libjars option will do it for you. The question is how you do this with EMR. There must be an easy way. I thought the -cachefile option of the CLI would do it, but I couldn't get it working somehow. Any ideas, anyone?
Thanks for the help.
The best luck I have had with external jar dependencies is to copy them (via bootstrap action) to /home/hadoop/lib throughout the cluster. That path is on the classpath of every host. This technique is the only one that seems to work regardless of where the code lives that accesses external jars (tool, job, or task).
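A bootstrap action for this can be as small as the following sketch — the bucket and jar paths are placeholders, and it assumes the S3 filesystem is available to `hadoop fs` as it normally is on EMR:

```shell
#!/bin/bash
# Bootstrap action: runs on every node before Hadoop starts.
# Copy the external jars from S3 into /home/hadoop/lib, which is
# on the classpath of every host in the cluster.
hadoop fs -copyToLocal s3://my-bucket/jars/*.jar /home/hadoop/lib/
```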
One option is to have the first step in your jobflow set up the JARs wherever they need to be. Or, if they are dependencies, you can package them in with your application JAR (which is probably in S3).
FYI: for newer versions of EMR, /home/hadoop/lib is not used anymore; /usr/lib/hadoop-mapreduce should be used instead.