Composing custom builds - JSON payload examples - Artifactory

Are there more examples of custom build JSON payloads beyond those available at https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API? Or perhaps more in-depth documentation on the “application/vnd.org.jfrog.build.BuildsByName+json” payload?
We have a build that produces both JAR/IVY and RPM files (and some other file types that Artifactory doesn’t really know the content of). Today, we publish those into a generic repository to keep everything together.
What would be ideal is to be able to create my own custom build using the REST API, composed of the JAR files + RPM files, so I can do licensing searches across them.
In the example given, the artifacts composing the build are referenced by ID/name/hash.
The problem with the current Jenkins/Artifactory/Gradle plugin that we use is that our build is split across many smaller builds, but ultimately they are released as one. This makes producing a full report somewhat difficult, and gives us no easy way to run license checks that include the RPM files. We want to be able to publish one build that contains everything we know about the release.
The current setup also has us uploading our JARs into a Maven repository, which adds time to the builds, given that we are also publishing the same content into the generic repository alongside the RPMs and other content.
Thanks!

The build info JSON is fully documented in the README of this repository: https://github.com/JFrogDev/build-info
That is also the repository that holds the code of the build info engine used by the various JFrog CI/build plugins. You can definitely create your own build info JSON, and if you're going to use Java to do it, you should check out this project, which demonstrates the usage of the various build info Java APIs:
https://github.com/JFrogDev/project-examples/tree/master/build-info-java-example
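For what it's worth, a minimal build info payload might look like the sketch below; the full schema is in that README, and the build name, module ID, checksums and timestamp here are all placeholders.

```json
{
  "version": "1.0.1",
  "name": "my-custom-build",
  "number": "42",
  "started": "2016-06-01T12:00:00.000+0000",
  "modules": [
    {
      "id": "com.example:my-app:1.0.0",
      "artifacts": [
        { "type": "jar", "name": "my-app-1.0.0.jar", "sha1": "<sha1>", "md5": "<md5>" },
        { "type": "rpm", "name": "my-app-1.0.0.x86_64.rpm", "sha1": "<sha1>", "md5": "<md5>" }
      ]
    }
  ]
}
```

Publishing it is then a single call to the documented Build Upload endpoint, e.g. curl -u user:pass -X PUT -H "Content-Type: application/json" -T build-info.json http://localhost:8081/artifactory/api/build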
Another option you may want to look into is the JFrog CLI, which recently added support for associating artifact deployment/resolution with a build object and deploying it to Artifactory. This method is completely agnostic to the file types your build produces or the build tool you are using. Have a look at the official documentation here:
https://www.jfrog.com/confluence/display/CLI/CLI+for+JFrog+Artifactory#CLIforJFrogArtifactory-BuildIntegration
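As a sketch (the repository, build name and build number below are made up), the flow looks roughly like this:

```sh
# Upload artifacts of any type and associate them with a build
jfrog rt upload "build-output/*.jar" generic-local/myapp/ --build-name=myapp-release --build-number=17
jfrog rt upload "build-output/*.rpm" generic-local/myapp/ --build-name=myapp-release --build-number=17

# Optionally capture environment variables into the build info
jfrog rt build-collect-env myapp-release 17

# Publish the accumulated build info to Artifactory
jfrog rt build-publish myapp-release 17
```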
Lastly, if you are using Jenkins, the Jenkins Artifactory Plugin now has Pipeline APIs that will let you collect artifacts and build information programmatically, and even concatenate multiple build info objects to deploy them as a single build entity to Artifactory, which is pretty wicked.
Have a read about this here:
https://wiki.jenkins-ci.org/display/JENKINS/Artifactory+-+Working+With+the+Pipeline+Jenkins+Plugin
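A rough scripted-pipeline sketch of that idea (the server ID, upload spec and repository are placeholder values):

```groovy
node {
    // 'my-artifactory' must match a server ID configured in Jenkins
    def server = Artifactory.server 'my-artifactory'
    def buildInfo = Artifactory.newBuildInfo()

    // Upload artifacts and record them in the build info object
    def uploadSpec = '''{
        "files": [
            { "pattern": "build-output/*", "target": "generic-local/myapp/" }
        ]
    }'''
    server.upload spec: uploadSpec, buildInfo: buildInfo

    // Another build info object (e.g. collected by a different stage)
    // can be concatenated into this one:
    // buildInfo.append otherBuildInfo

    // Everything is then deployed as a single build entity
    server.publishBuildInfo buildInfo
}
```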

Related

Artifactory - Concept of File Versions

I'm currently starting with JFrog Artifactory. Up to now I have only been working with source code control systems, not with binary repositories.
Can someone please tell me how the versioning of files is done in Artifactory?
I have been trying to deploy a file, then change it and deploy it again.
The checksum has changed, so it's the new file, but it seems that the old version is gone.
So it looks like there is no versioning of files. If I want that, do I have to encode it in the filename?
I found versioning related to packages, but I was thinking of using it for other files as well.
Thanks for your help
Christoph
Artifactory, unlike a VCS, does not manage a history of versions for a given path. When you deploy an artifact over an existing artifact, it will overwrite it (you can block this by configuring the right permissions).
If you wish to manage versions for generic artifacts (ones which are not managed by a known package manager like npm, Maven etc.), there are a couple of options you can take:
Add the version as part of the artifact name, for example foo-1.0.0.zip
Add the version as part of the artifact path, for example /foo/1.0.0/foo.zip
Combine the 2 above approaches, for example /foo/1.0.0/foo-1.0.0.zip
Use an existing package management tool which is flexible enough to handle generic packages. Many people use Maven to manage all types of packages beyond Java ones (this comes with its pros and cons)
From the Artifactory point of view there are a couple of capabilities you can leverage:
Generic repositories - aimed at managing proprietary packages which are not managed by a known package manager
Custom repository layout - can be used to define a custom layout for your generic repository and assist with tasks like automatic snapshot version cleanup
Properties - can be used to add version (and other) metadata to your artifacts, which can be used for searching, querying, resolution and more
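As an illustration of the last two points, assuming a generic repository named generic-local (paths and values below are examples), a versioned path plus a version property could be set like this:

```sh
# Deploy into a versioned path; matrix parameters after ';' become properties
curl -u user:password -T foo.zip \
  "http://localhost:8081/artifactory/generic-local/foo/1.0.0/foo-1.0.0.zip;version=1.0.0"

# Or tag an existing artifact via the Set Item Properties REST endpoint
curl -u user:password -X PUT \
  "http://localhost:8081/artifactory/api/storage/generic-local/foo/1.0.0/foo-1.0.0.zip?properties=version=1.0.0"
```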
Lastly, Conan is another option you should consider. Conan is a package manager intended for C and C++ packages. It is natively supported in Artifactory and can give you a more complete solution for managing your C libraries.
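A quick sketch of what using Conan against Artifactory looks like (the repository key and package reference are examples, in Conan v1 syntax):

```sh
# Register an Artifactory Conan repository as a remote
conan remote add my-artifactory http://localhost:8081/artifactory/api/conan/conan-local

# Upload a package, including its binaries, to that remote
conan upload "mylib/1.0.0@demo/stable" -r my-artifactory --all
```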

Alter an already published build

Is there a way to alter a build using the CLI after having published it?
Use case: a deployer (be it automated or manual) wants to add additional files (e.g. test-result logs) to an already published build, because they take a long time to create but the artifacts of the fresh build should be published ASAP.
When I rerun jfrog rt bp over and over again I get "new" builds with the same description (same build number etc.) instead of overwriting/extending the existing build.
Appreciating any hint :-)
The main idea of build info in Artifactory is that it is immutable, meaning it cannot be modified after publishing. This is to guarantee its integrity.
In your case, a possible way to achieve this may be:
When publishing the artifacts themselves, hold off on publishing the build info.
Instead, collect build info throughout your build cycle, and publish everything as a single build info object after all the tests.
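A sketch of that flow with the CLI (names, paths and numbers are placeholders):

```sh
# Publish the artifacts immediately, associating them with the build
jfrog rt upload "dist/*" generic-local/myapp/ --build-name=myapp --build-number=42

# ... long-running tests execute here ...

# Attach the test-result logs to the same build once they exist
jfrog rt upload "test-results/*.log" generic-local/myapp-logs/ --build-name=myapp --build-number=42

# Publish the build info once, at the very end, with everything collected
jfrog rt build-publish myapp 42
```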

How to promote Build and match custom layout in Artifactory?

I'm using Artifactory Pro with custom repository layouts. I promote my build and move all artifacts to my production repo, but I need to add an article number to this path so the team can reference it in their ERP system.
I tried some stuff here, promoting and moving artifacts to match their needs. It works, but it's not nice.
So I added my custom layouts:
For my developement repo:
[org]/[module] ...etc...
For my production repo:
[Articlenr<.*>]/[org]/[module] ...etc...
When I promote my build, my files are stored like this
[Articlenr]/customer/linux ...etc...
The article number placeholder is simply left as the literal [Articlenr]; I'm not able to replace it with the real one without moving the complete directory.
Does anyone here know how to set the article number while promoting this build?
My builds are promoted by JFrog CLI, but using the Artifactory REST API is an option, too.
Thanks a lot!
Currently, there's no way to use the promote command to promote a build with a target path as an argument.
If you are not set on using promotion, consider using the CLI's COPY or MOVE commands, where you can use placeholders in the target path to inject your Articlenr.
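For example (repository names and the article number are made up), a placeholder captured in the source pattern can be reused in the target path:

```sh
# Copy everything under myapp/ in the dev repo into the production repo,
# prefixing the target path with the article number (here 10042)
jfrog rt copy "dev-repo/myapp/(*)" "prod-repo/10042/myapp/{1}"
```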
The downside of using cp/mv instead of bpr is that your build will not be flagged as promoted in Artifactory's build info, which may be a problem in some cases (for example, if you are using build retention).
It is not an ideal solution, but it might suffice for what you are trying to accomplish.
HTH, Or

Adobe CQ5 Setup in production

I am not a CQ guy, but I have to use CQ5 for one of my projects. I have a CAT and a production environment, and the following questions:
I want to use the author instance of my CAT only. Once I publish the content in CAT, it should publish to production as well. Is this possible?
When I deploy a new build of Adobe CQ to production (new build, code changes, etc.), will my content be lost?
I read somewhere about content packages in CQ5. Can I separate content changes and code changes in one CQ5 environment?
Thanks in advance.
To answer question 1...
This is not a recommended setup, but a common misconception for someone unfamiliar with AEM/CQ5. The "author" and "publish" instances should be part of the same environment. For example you should have a production author, probably behind your firewall, and production publish to serve pages to the public.
Your CAT environment should have the same thing. You want your testing environment to match as closely as possible to your production environment, including web server and dispatcher setup, to ensure quality.
Consider this. You can use one production publish instance, but it's a single point of failure. It's a general best practice to load balance across at least two. Two is sufficient for most websites. If you do this, you'd want to mimic the architecture in CAT.
To answer question 2...
If your code is written, built, and deployed correctly, it should not delete your content. Just make sure you never deploy anything to /content (to avoid deleting content), or to /libs and most of /etc (to avoid overriding platform functionality). AEM/CQ5 is a very open product, so you can do very bad things; but if you know what not to do, you are safe.
Code deployments should typically be done as part of a CRX Content Package, which brings me to...
To answer question 3...
The way we build and deploy code is to have Maven compile the Java, package everything up in a CRX Package, then deploy to the instance using the Package Manager REST API. Adobe provides a Maven Archetype that will facilitate this.
A CRX Package is a file system representation of your content repository, wrapped in what is effectively an annotated Zip file. Your compiled Java code is included in that file system representation, in a folder (to become a node) named "install". That compiled Java is an OSGi bundle, which is an annotated JAR. When CRX Package Manager deploys all those nodes to the system, OSGi accepts the bundle, assuming it's valid. This is why you can do "hot" deployments to live, production AEM/CQ5 instances with very little risk.
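For illustration, uploading and installing a package through the Package Manager HTTP service might look like this (host, credentials and package file name are placeholders):

```sh
# Upload and install a CRX content package in one call
curl -u admin:admin \
  -F file=@"com.example.myapp-1.0.0.zip" \
  -F install=true \
  http://localhost:4502/crx/packmgr/service.jsp
```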
So...
This is a very high level answer to some very big topics. I encourage you to do a lot more research before you set this up. There are many good blog posts and documentation pages out there to help you get this set up according to best practice. Good luck!

CCNET - build task required? Multiple repositories, one CCNET source section per project

CCNET questions - Here's the scenario:
I've got 10 developers doing local development against a Sitecore installation, with Git as version control. When done with a feature/fix, they push to an integration repository.
I've got CCNET set up for the Sitecore project, pointing to the remote integration repository and the local live QA code base. CCNET finds the commits my developers have made to the integration repository and then updates the QA code base repository.
I also have a couple of other .NET class library projects that are managed by CCNET and compiled with their output pointed to the Sitecore bin dir.
The Sitecore installation is merely the result of a build with no compilable aspects. It's a web product with its own API, as well as the ability to integrate custom DLLs that we create to customize the product.
Questions:
Is the CCNET build task required as a condition to execute other activities such as NUnit or robocopy? (I ask because a "build" is natively used to compile an app and generate output, whereas the only reason we'd want to build is to make sure all dependencies are there so we can jump straight to unit testing.)
If my developers are NOT pointing to a centralized repository like integration, how would CCNET know where all of their remote Git repositories are, when the config doc only allows one Git source control section per project?
When I configure the Git VC specs per project, it asks for the branch, which is statically saved in the config. Does CCNET have the ability to accept different branches dynamically?
There's no need to have an "actual build" in your project; it can consist of any type of task inside the tasks element. I have a couple of projects which only copy the files from the repository to an FTP server after deleting some files which shouldn't be published.
I have no experience with Git, but you can define multiple source control blocks of any type if you use the multi source control block.
You could use dynamic parameters, which allow the user to set their values when triggering the build.
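A rough ccnet.config sketch touching the first two points (paths, URLs and names are all placeholders):

```xml
<project name="sitecore-qa">
  <!-- Multi source control block wrapping several Git repositories -->
  <sourcecontrol type="multi">
    <sourceControls>
      <git>
        <repository>git://server/integration.git</repository>
        <branch>master</branch>
        <workingDirectory>C:\builds\integration</workingDirectory>
      </git>
      <!-- additional <git> blocks for other repositories go here -->
    </sourceControls>
  </sourcecontrol>
  <tasks>
    <!-- no compile step required; run the tests and copy files directly -->
    <nunit>
      <path>C:\Program Files\NUnit\bin\nunit-console.exe</path>
      <assemblies>
        <assembly>C:\builds\integration\MyTests.dll</assembly>
      </assemblies>
    </nunit>
    <exec>
      <executable>robocopy</executable>
      <buildArgs>C:\builds\integration C:\inetpub\qa /MIR</buildArgs>
      <!-- robocopy exits 1 on a successful copy, so accept it -->
      <successExitCodes>0,1</successExitCodes>
    </exec>
  </tasks>
</project>
```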
