SBT publishing snapshots to Sonatype

I am working on a project whose source has been open sourced, and we've decided to publish it to Maven Central.
https://github.com/mdsol/mauth-java-client/tree/refactor/publish_to_sonatype
Currently we are publishing to an internal repo that allows publishing SNAPSHOT versions with timestamps, so we can publish the same snapshot version multiple times. However, it looks like Sonatype doesn't allow uploading with timestamps or overwriting.
How do I delete an existing snapshot on Sonatype so that a new one can be published as part of the sbt build? Also, do I need to sonatypeRelease the snapshots?

You have to create a Sonatype account, and then log in via their web front-end: https://oss.sonatype.org/
Once you have run sbt publishSigned, for example, you can search for your package in the (staging) repositories and, provided you are logged in while doing so, delete or release it. (I found that sbt sonatypeRelease did not reliably release my package, so I ended up using the web front-end exclusively for that step.)
This is also outlined, more or less, in the official sbt documentation:
https://www.scala-sbt.org/1.x/docs/Using-Sonatype.html#Third+-+Publish+to+the+staging+repository
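For the snapshot part of the question, the guide linked above routes snapshots to the Sonatype snapshots repository rather than the staging area, which is what normally allows the same snapshot version to be re-published; snapshots don't go through staging, so sonatypeRelease isn't involved for them. A minimal publishTo sketch along the lines of that guide (assuming the sbt-sonatype/sbt-pgp setup it describes):

// publishTo wiring as shown in the linked sbt guide:
// snapshots go to the snapshots repository, releases go to staging.
publishTo := {
  val nexus = "https://oss.sonatype.org/"
  if (isSnapshot.value)
    Some("snapshots" at nexus + "content/repositories/snapshots")
  else
    Some("releases" at nexus + "service/local/staging/deploy/maven2")
}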

Related

Cannot see the builds after publishing the artifacts

I am new to Artifactory and just going through the guides and trying out some sample code.
I am trying to publish a Maven artifact to my Artifactory server. I followed these steps:
Through the "Set me up" tool, get the settings.xml file for Maven.
Download the settings.xml file and place it in the ~/.m2 folder.
Clone the maven example repo from the artifactory examples available on git.
Update the pom.xml file and add the distributionManagement tag provided in the "Set me up" window.
Publish the code using mvn deploy
The binaries are published to the Artifactory server and are available in the Artifact Repository Browser window, but I do not see any corresponding builds in the Build Browser. I also do not get any builds if I try to fetch them using the REST API.
What am I missing? I followed the above steps because I saw it on the Introduction to Artifactory webinar video. Is there any setting that I need to change to see the builds in the Build Browser window?
I am using Artifactory Version 5.10.3 (OSS)
I think there is a misunderstanding: the Maven example project is used for publishing artifacts to Artifactory; however, it doesn't publish build info.
In order to publish build info to Artifactory, you will need to either use a CI server with an Artifactory plugin (for example, Jenkins, Bamboo, or TeamCity) or use the Maven Artifactory Plugin:
https://www.jfrog.com/confluence/display/RTF/Maven+Artifactory+Plugin
which has the build-information publishing step built in.

Composing custom builds - JSON payload examples

Are there more examples of custom build JSON payloads beyond those available at https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API? Or perhaps more in-depth documentation on the “application/vnd.org.jfrog.build.BuildsByName+json” payload?
We have a build that produces both JAR/IVY and RPM files (and some other file types that Artifactory doesn’t really know the content of). Today, we publish those into a generic repository to keep everything together.
What would be ideal is to be able to create my own custom build using the REST API, composed of the JAR files + RPM files, so I can do licensing searches across them.
In the given example, the artifacts that make up the build are referenced by ID/name/hash.
The problem with the current Jenkins/Artifactory/Gradle plugin that we use is that our build is split across many smaller builds which are ultimately released as one. This makes producing a full report somewhat difficult, and it gives us no easy way to do license checks that include the RPM files. We want to be able to publish one build which contains everything we know about the build.
The current setup also has us uploading our JARs into a Maven repository, which adds time to the builds, given that we are also publishing the same content into the generic repository alongside the RPMs and other content.
Thanks!
The build info JSON is fully documented in the README of this repository, which is also the repository that holds the code of the build-info engine used by the various JFrog CI/build plugins:
https://github.com/JFrogDev/build-info
You can definitely create your own build-info JSON, and if you're going to use Java to do that, you should check out this project, which demonstrates the usage of the various build-info Java APIs:
https://github.com/JFrogDev/project-examples/tree/master/build-info-java-example
Another option you may want to look into is the JFrog CLI, which recently added support for associating artifact deployment/resolution with a build object and deploying it to Artifactory. This method is completely agnostic to the file types your build produces or the build tool you are using. Have a look at the official documentation here:
https://www.jfrog.com/confluence/display/CLI/CLI+for+JFrog+Artifactory#CLIforJFrogArtifactory-BuildIntegration
Lastly, if you are using Jenkins, the Jenkins Artifactory Plugin now has Pipeline APIs that will allow you to collect artifacts and build information programmatically, and even concatenate multiple build-info objects to deploy them as a single build entity to Artifactory, which is pretty wicked. Have a read about this here:
https://wiki.jenkins-ci.org/display/JENKINS/Artifactory+-+Working+With+the+Pipeline+Jenkins+Plugin
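If you end up hand-rolling the build-info JSON outside of Java, the upload itself is just an HTTP PUT to Artifactory's /api/build endpoint. The snippet below is only a rough sketch: the field names follow the build-info README linked above (double-check them against the spec), and the host, credentials, module id, and checksums are placeholders; the checksums should match artifacts you have already deployed so Artifactory can associate them with the build.

// Rough sketch: PUT a hand-rolled build-info JSON to Artifactory.
// All values below (host, credentials, module id, checksums) are placeholders.
import java.net.{HttpURLConnection, URL}
import java.nio.charset.StandardCharsets
import java.util.Base64

object PublishBuildInfo {
  def main(args: Array[String]): Unit = {
    val buildInfoJson =
      """{
        |  "version": "1.0.1",
        |  "name": "my-aggregated-build",
        |  "number": "42",
        |  "started": "2017-01-01T12:00:00.000+0000",
        |  "modules": [{
        |    "id": "com.example:my-app:1.0.0",
        |    "artifacts": [
        |      { "type": "jar", "name": "my-app-1.0.0.jar", "sha1": "<sha1>", "md5": "<md5>" },
        |      { "type": "rpm", "name": "my-app-1.0.0.noarch.rpm", "sha1": "<sha1>", "md5": "<md5>" }
        |    ]
        |  }]
        |}""".stripMargin

    val conn = new URL("https://artifactory.example.com/artifactory/api/build")
      .openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod("PUT")
    conn.setDoOutput(true)
    conn.setRequestProperty("Content-Type", "application/json")
    val auth = Base64.getEncoder.encodeToString(
      "user:password".getBytes(StandardCharsets.UTF_8))
    conn.setRequestProperty("Authorization", s"Basic $auth")
    conn.getOutputStream.write(buildInfoJson.getBytes(StandardCharsets.UTF_8))
    // A 2xx response indicates the build info was accepted.
    println(s"Artifactory responded with HTTP ${conn.getResponseCode}")
    conn.disconnect()
  }
}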

Nexus error "Could not find artifact" when the same Nexus displays artifact metadata

On the Sonatype.com website I can read the following:
Nexus is a repository manager. It allows you to proxy, collect, and manage your dependencies so that you are not constantly juggling a collection of JARs. It makes it easy to distribute your software. Internally, you configure your build to publish artifacts to Nexus and they then become available to other developers. You get the benefits of having your own ‘central’, and there is no easier way to collaborate.
The part about "constantly juggling a collection of JARs" I find intriguing.
In my experience, this is exactly what the Nexus process looks like.
As an example, my build is failing with this message:
[ERROR] Failed to execute goal on project myproject: Could not resolve dependencies for project myproject:jar:1.0.0-SNAPSHOT: Could not find artifact net.sf.saxon:saxon-dom:jar:9.0 in nexus (https://mynexus:8443/nexus/content/groups/public/)
So supposedly the Nexus repo at https://mynexus:8443/nexus/content/groups/public/ does not contain this artifact.
Using the Nexus web interface, however, I can search for and find this particular artifact. It is located in the JBoss Maven2 repo.
What I can also do is navigate the index of the Public Repositories and find this particular artifact saxon-dom version 9.0 manually by expanding the tree navigator. It is in the folder net\sf\saxon.
So my conclusion is that Nexus is exactly not doing what it is claiming to be doing. It is not helping me manage dependencies - I have to resolve those manually.
What results is exactly like constantly juggling collections of jars. I have to manually download those and put them on the class path in order to perform a build.
As a Repository Manager it does not look very useful.
As it turns out, I needed to wrap my brain around the way that Nexus deals with missing dependencies.
The issue, I think, is that the artifact saxon-dom was once part of the repo but was removed at some point. See https://repository.jboss.org/nexus/content/groups/public/net/sf/saxon/saxon-dom/
So there is still some metadata, but not the jar and the pom.
When I search for the artifact, Nexus finds it based on this metadata; in the search results I can see the jar and pom.
I mistakenly thought that the artifacts had been found and were in the repo. This is not the case: if you try to download the jar, you get an error.
So Nexus caches the 404, the fact that it was unable to find the artifact. But it is completely unclear in the UI that the result is a cached 404.

How to publish an sbt plugin to Nexus repositories?

I am new to sbt plugin publishing and I have just rewritten some features of an existing plugin. It has been working locally since I ran publish-local in the sbt console. Now I want to publish it to Nexus repositories. Are there any good tutorials on how to do that?
Right now, publishing sbt plugins to Nexus repos can cause some issues, but generally this should apply: http://www.scala-sbt.org/0.13/docs/Using-Sonatype.html
Additionally, you want to make sure in your plugin's build.sbt file:
sbtPlugin := true
publishMavenStyle := true
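Put together, a minimal build.sbt for publishing to an in-house Nexus might look like the sketch below; the organization, version, Nexus URL, and credentials path are placeholders rather than values from your setup:

// Hypothetical build.sbt sketch -- replace the placeholders with your own values.
sbtPlugin := true
publishMavenStyle := true

organization := "com.example"
name := "sbt-my-plugin"
version := "0.1.0-SNAPSHOT"

publishTo := {
  val nexus = "https://nexus.example.com/"  // placeholder Nexus host
  if (isSnapshot.value)
    Some("snapshots" at nexus + "content/repositories/snapshots")
  else
    Some("releases" at nexus + "content/repositories/releases")
}

// Keep the Nexus credentials outside the build, e.g. in ~/.ivy2/.credentials
credentials += Credentials(Path.userHome / ".ivy2" / ".credentials")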
There are a few issues currently:
sbt plugins are not legitimate Maven artifacts, which is why most sbt plugins are published to "raw" repositories in this fashion: http://www.scala-sbt.org/0.13/docs/Bintray-For-Plugins.html
Nexus will sometimes generate pom.xml files for Ivy-deployed artifacts, which can seriously mess with sbt's resolution.
That said, a few users are, and have been, successfully deploying plugins to Maven Central or Nexus repositories. We're actively working on sbt-ivy integration at the moment, so you will hopefully see more guidance in the Nexus + sbt area soon.

CCNET - build task required? Multiple repositories, one CCNET source section per project

CCNET questions - Here's the scenario:
I've got 10 developers doing local development on a Sitecore installation, with Git as version control. When they are done with a feature/fix, they push to an integration repository.
I've got CCNET set up for the Sitecore project, pointing to the remote integration repo and the local live QA code base. CCNET finds the commits that my developers have made to the integration repository and then updates the QA code base repository.
I also have a couple of other .NET class library projects that are managed by CCNET and compiled with their output pointed at the Sitecore bin dir.
The Sitecore installation is merely the result of a build with no compilable aspects. It's a web product with its own API as well as the ability to integrate custom DLLs that we create to customize the product.
Questions:
Is a CCNET build task required as a condition for executing other activities such as NUnit or Robocopy? (The reason I ask is that a "build" is natively used to compile an app and generate output, whereas the only reason we'd want to build is to make sure all dependencies are there so we can jump straight to unit testing.)
If my developers are NOT pointing to a centralized repo like integration, how would CCNET know where all of their remote Git repositories are when the config doc only allows one Git source control section per project?
Per project, when I configure the Git version-control specs, it asks for the branch, which is statically saved to the config. Does CCNET have the ability to accept different branches dynamically?
There's no need to have an "actual build" in your project - it could consist of any type of tasks inside the tasks element. I have a couple of projects which only copy the files from the repository to an FTP server after deleting some files which shouldn't be published.
I have no experience with Git, but you can define multiple source control blocks of any type if you use the multi source control block.
You could use dynamic parameters which allow the user to set their values when triggering the build.
