How does artifactory resolve dependencies that are available via
http://my.af.com/artifactory/simple/repo
Is this repo path even documented? I can't find anything. It's the equivalent of Nexus's public group, I guess, where we can configure inclusions and exclusions for path resolution. Can Artifactory do this?
What you are looking for is http://my.af.com/artifactory/repo. This is the default global repository (and, apparently, it is documented).
Usage of this repository is discouraged, since it defeats the purpose of having multiple repositories; e.g. you can't separate releases from snapshots, or build dependencies from build plugins.
While you can define include/exclude patterns on any local, remote or virtual repository, and they are all active when resolving from /repo, you can't define the patterns on /repo itself (since it's not a real, configurable virtual repository).
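For illustration, the include/exclude patterns live on the individual repositories that /repo aggregates. They can be set in the UI or through the repository configuration JSON used by the REST API; a rough sketch, where the repository key, URL and patterns are made-up examples:

{
  "key": "maven-central-remote",
  "rclass": "remote",
  "url": "https://repo1.maven.org/maven2/",
  "includesPattern": "org/**,com/mycompany/**",
  "excludesPattern": "com/mycompany/internal/**"
}

With patterns like these on each repository, resolution through /repo will only ever see the paths the patterns allow.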
Related
I'm currently starting with JFrog Artifactory. Up to now I have only been working with source code control systems not with binary repositories.
Can someone please tell me how the versioning of files is done in Artifactory?
I have been trying to deploy a file, then change it and deploy it again.
The checksum has changed, so it's the new file. But it seems that the old version is gone.
So it looks like there are no versions of files. If I want that, do I have to do it in the filename?
I found versions related to packages, but I was thinking of using it for other files as well.
Thanks for your help
Christoph
Artifactory, unlike a VCS, does not manage a history of versions for a given path. When you deploy an artifact over an existing artifact, it will overwrite it (you can block this by configuring the right permissions).
If you wish to manage versions for generic artifacts (ones which are not managed by a known package manager like npm, Maven, etc.), there are a couple of options you can take (a deployment sketch follows the list):
Add the version as part of the artifact name, for example foo-1.0.0.zip
Add the version as part of the artifact path, for example /foo/1.0.0/foo.zip
Combine the 2 above approaches, for example /foo/1.0.0/foo-1.0.0.zip
Use an existing package management tool which is flexible enough to handle generic packages. Many people are using Maven to manage all types of packages beyond Java ones (it comes with its pros and cons)
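As a rough illustration of options 2 and 3, a generic artifact can be deployed to a versioned path with a plain HTTP PUT; the repository name generic-local and the credentials below are placeholders:

# deploy foo.zip under a versioned folder and with a versioned file name
curl -u myuser:mypassword -T foo.zip \
  "http://my.af.com/artifactory/generic-local/foo/1.0.0/foo-1.0.0.zip"

Deploying a new version then goes to a new path (e.g. /foo/1.0.1/...), so nothing gets overwritten.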
From the Artifactory point of view there are a couple of capabilities you can leverage:
Generic repositories - aimed at managing proprietary packages which are not managed by a known package manager
Custom repository layout - can be used to define a custom layout for your generic repository and assist with tasks like automatic snapshot version cleanup
Properties - can be used to add version (and other) metadata to your artifacts, which can be used for searching, querying, resolution and more (see the example below)
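For example, version metadata can be attached and searched with the item properties REST API; the repository, path and credentials below are placeholders:

# attach a version property to an already-deployed artifact
curl -u myuser:mypassword -X PUT \
  "http://my.af.com/artifactory/api/storage/generic-local/foo/1.0.0/foo-1.0.0.zip?properties=version=1.0.0"

# find artifacts carrying that property
curl -u myuser:mypassword \
  "http://my.af.com/artifactory/api/search/prop?version=1.0.0&repos=generic-local"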
Lastly, Conan is another option you should consider. Conan is a package manager intended for C and C++ packages. It is natively supported in Artifactory and can give you a more complete solution for managing your C libraries.
We use gradle for downloading dependencies (from our local Artifactory instance) and publishing artifacts (to our local Artifactory instance), but need to use a variety of different (non-gradle) tools for doing the build itself.
We therefore have a set of dependencies of various types (test dependencies, compile dependencies, etc) expressed in our gradle file, and our build starts out by running gradle to fetch some of those dependencies.
We then transition to other tools for doing the true build, and then after our build completes, we run the gradle artifactoryPublish task to publish the release artifacts back to Artifactory.
The Artifactory build-info construct appears to be a great match for tracking our dependencies and release artifacts.
However... the build-info construct published by the gradle artifactoryPublish task to Artifactory does not include any of our dependency information. Our dependency downloads are done in a separate unrelated task, and artifactoryPublish task does not know they are related. Additionally, there are other (test, etc) dependencies expressed in gradle which I would like to have listed in the build-info (so they can be promoted) but which are not needed/downloaded as part of the build process.
I am looking for a way to get ALL the expressed gradle dependencies (some of them optional, and not downloaded before the build) expressed in the Artifactory build-info construct.
I noticed that the jfrog CLI tool has support for recording dependencies downloaded via out-of-band mechanisms, via its build-add-dependencies action. I believe I can explicitly download every dependency (including test dependencies and other optional dependencies not needed by the build), then use this action to get those references added to the build-info construct. But... this seems inefficient, since I don't actually need all these dependencies for the build; I just want the build-info to reference them.
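For reference, the usage I have in mind looks roughly like the following (the build name, number and file pattern are placeholders); as far as I can tell the CLI needs the files on the local disk so it can checksum them:

# register locally downloaded files as dependencies of the build
jfrog rt build-add-dependencies myBuild 42 "downloaded-deps/*.jar"

# later, publish the accumulated build-info to Artifactory
jfrog rt build-publish myBuild 42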
I also noticed that in the Artifactory REST API there is an option for manually uploading a build-info file. So presumably I could manually create a build-info file referencing dependencies which already exist in Artifactory, without actually downloading them locally. This seems efficient, but... this seems brittle/error-prone since the build-info construct will likely change over time.
Are there any other or better options to consider?
On the Sonatype.com website I can read the following:
Nexus is a repository manager. It allows you to proxy, collect, and
manage your dependencies so that you are not constantly juggling a
collection of JARs. It makes it easy to distribute your software.
Internally, you configure your build to publish artifacts to Nexus and
they then become available to other developers. You get the benefits
of having your own ‘central’, and there is no easier way to
collaborate.
The part about "constantly juggling a collection of JARs" I find intriguing.
In my experience, this is exactly what the Nexus process looks like.
As an example, my build is failing with the message:
[ERROR] Failed to execute goal on project myproject: Could not resolve dependencies for project myproject:jar:1.0.0-SNAPSHOT: Could not find artifact net.sf.saxon:saxon-dom:jar:9.0 in nexus (https://mynexus:8443/nexus/content/groups/public/)
So supposedly the Nexus repo at https://mynexus:8443/nexus/content/groups/public/ does not contain this artifact.
Using the Nexus web interface I can however search and find this particular artifact. It is located in the Jboss Maven2 repo.
What I can also do is navigate the index of the Public Repositories and find this particular artifact saxon-dom version 9.0 manually by expanding the tree navigator. It is in the folder net\sf\saxon.
So my conclusion is that Nexus is exactly not doing what it is claiming to be doing. It is not helping me manage dependencies - I have to resolve those manually.
What results is exactly like constantly juggling collections of jars. I have to manually download those and put them on the class path in order to perform a build.
As a Repository Manager it does not look very useful.
As it turns out, I needed to wrap my brain around the way that Nexus deals with missing dependencies.
The issue, I think, is that the artifact saxon-dom was once part of the repo but was removed at some point. See https://repository.jboss.org/nexus/content/groups/public/net/sf/saxon/saxon-dom/
So there is still some metadata, but not the jar and the pom.
When I search for the artifact, Nexus finds it based on this metadata; in the search results I can see the jar and pom.
Now I mistakenly thought that the artifacts were found and in the repo. This is not the case, because if I try to download the jar I get an error message.
So Nexus caches the 404, the fact that it was unable to find the artifact. But it is completely unclear in the UI that the result is a cached 404.
I have a jar dependency that resides on a remote server. How do I resolve that in Gradle? There doesn't seem to be a way to define a repository for remote files, only local files, and something like this doesn't seem to work either:
compile("group:name:version") {
    artifact {
        url = "http://server/dep.jar"
    }
}
The docs seem to hint that something like this should be possible, but so far I've been unable to find an example anywhere.
Support for non-managed dependencies: If your dependencies are simply files in version control or a shared drive, Gradle provides powerful functionality to support this.
Any ideas?
Unfortunately, according to the Gradle documentation, a remote share is currently not supported.
However, this can be worked-around easily by doing the following:
Add code that copies the remote jars locally. A possible location for such code is the repositories configuration, to ensure that the files are copied before the dependencies get resolved.
Define a local repository pointing to the local location of the jars.
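A minimal sketch of that workaround, assuming the single jar from the question (the URL, directory and module name are placeholders):

// download the remote jar into a local directory during configuration,
// i.e. before dependency resolution, then expose that directory as a
// flatDir repository
def remoteLibs = file("$buildDir/remote-libs")

repositories {
    remoteLibs.mkdirs()
    def jar = new File(remoteLibs, "dep.jar")
    if (!jar.exists()) {
        // copy the remote file locally
        jar.bytes = new URL("http://server/dep.jar").bytes
    }
    flatDir { dirs remoteLibs }
}

dependencies {
    // with a flatDir repository the jar is matched by module name
    compile name: "dep"
}

Putting the download inside the repositories block is just one way to make sure it runs before resolution; a dedicated download task wired in ahead of compilation works as well.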
I have several Scala modules that I'm building with SBT. Some of them (I'll call them dependant modules) are being published to Artifactory and then used by top-level modules.
All changes to the code are done in separate git branches. When the feature (or bugfix) is done, that branch is compiled in Jenkins and then deployed to the testing instance and handed off to the QA team.
So it's possible that there will be several git branches with different code in the dependant modules.
The problem is that Ivy is caching these modules locally, so it's possible that the top-level module will be built with the dependant module from a different branch (taken from the local cache).
I've tried adding the changing() directive to the dependency specification in build.sbt.
In that case Ivy ignores the local cache and goes to Artifactory to download the POM file every time. It then parses the POM file, but concludes that it already has the jar file with that version in the local cache, so it fetches the jar from the local cache and not from Artifactory, which isn't what I want.
Since the code in branches hasn't been integrated into the master branch at this point, it's perfectly valid that different feature branches have the same version number, but different code.
Is there a way to tell Ivy (via SBT) to ignore local cache for a certain groupid? Or a single dependency at least?
If you are using versioning for your dependant modules, then each codebase change must produce a different version. Ivy and Maven expect that once an artifact has been published with a specific version it will stay unchanged forever; that is why they use cached files. If you want to download a fresh version from the repository on every compile, you should add the -SNAPSHOT suffix to the dependant module's version number (e.g. dep-module-1.1.1-SNAPSHOT).
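A minimal sketch of what that looks like in sbt, with made-up coordinates:

// in the dependant module's build.sbt: publish it as a snapshot
version := "1.1.1-SNAPSHOT"

// in the top-level module's build.sbt: depend on the snapshot;
// sbt normally treats -SNAPSHOT versions as changing, so the artifact is
// re-checked against Artifactory instead of trusted from the local cache
libraryDependencies += "com.example" %% "dep-module" % "1.1.1-SNAPSHOT"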