What Is The "Wharf cache default layout" In Artifactory? - artifactory

From the Artifactory documentation at https://www.jfrog.com/confluence/display/JFROG/Repository+Layouts#RepositoryLayouts-BundledLayouts
Bundled Layouts
Artifactory comes out-of-the-box with a number of default, predefined
layouts requiring no additional configuration:
Maven 2/3
Ivy (default layout)
Gradle (Wharf cache default layout)
Maven 1
What is this "Wharf cache default layout" linked to Gradle?
I can access Maven Central perfectly fine with Gradle so I am confused about what this extra layout is.
Is this Gradle layout identical to a Maven 2/3 layout or is it something else?
If I choose Gradle (Wharf cache default layout) in Artifactory, can I access it with both Gradle and Maven or is there something different about it that restricts it to only Gradle use?
If it does restrict use to Gradle only, why would you use it? (since that appears to give it fewer features than the Maven 2/3 layout)
If Maven can access it too, why would you use it? (as it seems redundant)
Is there some performance boost or some other difference that is a reason to use it?

Maven is the name for both a build tool and a repository type, while Gradle is only a build tool. Gradle can work against Maven, Gradle, and Ivy repositories.
According to the JFrog docs, Artifactory can be used as the Gradle build cache by simply creating a generic repository in Artifactory. The Wharf cache default layout was introduced back in Gradle 1.0; I believe it was meant as a way to share the local Gradle cache. That was long ago, though, and while the layout remains, the way Gradle repositories work today is more adaptive to their use case:
You can now define a layout for each part: the plugins resolver, the libs resolver, and the libs publisher. Gradle, being a build tool and not a repository type, needs a private registry that can adapt, and Artifactory makes it really easy to set one up.
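In practice, most Gradle builds resolve from Artifactory using the standard Maven 2 layout rather than the legacy Wharf layout. A minimal sketch of such a repository declaration in the Groovy DSL (the Artifactory URL and repository key are hypothetical):

repositories {
    maven {
        // hypothetical Artifactory instance and repository key
        url 'https://artifactory.example.com/artifactory/libs-release'
    }
}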

Related

Artifactory - Concept of File Versions

I'm currently starting with JFrog Artifactory. Up to now I have only been working with source code control systems not with binary repositories.
Can someone please explain how the versioning of files is done in Artifactory?
I have been trying to deploy a file, then change it and deploy it again.
The checksum has changed, so it's the new file. But it seems that the old version is gone.
So it looks like there are no versions of files. If I want that, do I have to encode it in the filename?
I found versions related to packages.
But I was thinking of using it for other files as well.
Thanks for your help
Christoph
Artifactory, unlike a VCS, does not manage a history of versions for a given path. When you deploy an artifact over an existing artifact, it will overwrite it (you can block this by configuring the right permissions).
If you wish to manage versions for generic artifacts (ones which are not managed by a known package manager like npm, Maven, etc.), there are a couple of options you can take:
Add the version as part of the artifact name, for example foo-1.0.0.zip
Add the version as part of the artifact path, for example /foo/1.0.0/foo.zip
Combine the two approaches above, for example /foo/1.0.0/foo-1.0.0.zip
Use an existing package management tool which is flexible enough to handle generic packages. Many people are using Maven to manage all types of packages beyond Java ones (it comes with its pros and cons)
From the Artifactory point of view there are a couple of capabilities you can leverage:
Generic repositories - aimed at managing proprietary packages which are not managed by a known package manager
Custom repository layout - can be used to define a custom layout for your generic repository and assist with tasks like automatic snapshot version cleanup (see the layout sketch after this list)
Properties - can be used to add version (and other) metadata to your artifacts, which can be used for searching, querying, resolution, and more
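As an illustration of the custom layout option, here is a minimal sketch of an Artifactory layout pattern for the /foo/1.0.0/foo-1.0.0.zip convention above (the tokens follow Artifactory's repository layout syntax; the exact pattern is an assumption to adapt to your own paths, and since Artifactory layouts also expect an [org] or [orgPath] token, a real path would look more like acme/foo/1.0.0/foo-1.0.0.zip):

[orgPath]/[module]/[baseRev]/[module]-[baseRev].[ext]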
Lastly, Conan is another option you should consider. Conan is a package manager intended for C and C++ packages. It is natively supported in Artifactory and can give you a more complete solution for managing your C and C++ libraries.

Is there a way to reference dependencies in an Artifactory build-info without downloading the dependencies locally?

We use gradle for downloading dependencies (from our local Artifactory instance) and publishing artifacts (to our local Artifactory instance), but need to use a variety of different (non-gradle) tools for doing the build itself.
We therefore have a set of dependencies of various types (test dependencies, compile dependencies, etc) expressed in our gradle file, and our build starts out by running gradle to fetch some of those dependencies.
We then transition to other tools for doing the true build, and after our build completes, we run the Gradle artifactoryPublish task to publish the release artifacts back to Artifactory.
The Artifactory build-info construct appears to be a great match for tracking our dependencies and release artifacts.
However... the build-info construct published by the Gradle artifactoryPublish task to Artifactory does not include any of our dependency information. Our dependency downloads are done in a separate, unrelated task, and the artifactoryPublish task does not know they are related. Additionally, there are other (test, etc.) dependencies expressed in Gradle which I would like to have listed in the build-info (so they can be promoted) but which are not needed/downloaded as part of the build process.
I am looking for a way to get ALL the declared Gradle dependencies (some of them optional, and not downloaded before the build) reflected in the Artifactory build-info construct.
I noticed that the jfrog CLI tool has support for recording dependencies downloaded via out-of-band mechanisms -- via its build-add-dependencies action. I believe I could explicitly download every dependency (including test dependencies and other optional dependencies not needed by the build), then use this action to get those references added to the build-info construct. But... this seems inefficient, since I don't actually need all these dependencies for the build; I just want the build-info to reference them.
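For reference, a minimal sketch of that CLI flow (the build name, build number, and pattern are placeholders, and it assumes the dependencies were first downloaded to a local directory, since build-add-dependencies scans the local file system):

jfrog rt build-add-dependencies myBuild 42 "build-deps/*.jar"
jfrog rt build-publish myBuild 42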
I also noticed that the Artifactory REST API has an option for manually uploading a build-info file. So presumably I could manually create a build-info file referencing dependencies that already exist in Artifactory, without actually downloading them locally. This seems efficient, but... it also seems brittle/error-prone, since the build-info format will likely change over time.
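For completeness, the manual route would look roughly like this (the URL and credentials are placeholders; the hand-written build-info.json is the part that may drift as the schema evolves):

curl -u user:password -X PUT -H "Content-Type: application/json" \
    -d @build-info.json https://artifactory.example.com/artifactory/api/build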
Are there any other or better options to consider?

Is it necessary to use Gradle with Robolectric for Android unit testing?

My team and I have built an Android library project. It is set up in Eclipse, but we are using Ant to build it; presently we aren't using Gradle. Instead, I have a Robolectric jar file with dependencies. But when I use this, the following error comes up while running the unit tests:
WARNING: multiple versions of ant detected in path for junit
[junit] jar:file:/Users/prateekarora/Desktop/eclipse/plugins/org.apache.ant_1.9.2.v201404171502/lib/ant.jar!/org/apache/tools/ant/Project.class
[junit] and jar:file:/Users/prateekarora/trunk/client/android/MCCMobileClient/test2/libs/robolectric-2.3-with-dependencies.jar!/org/apache/tools/ant/Project.class
When I remove Apache Ant from Eclipse's plugins folder, this stops working.
Can anybody explain why this is happening?
Also, is it necessary to use Robolectric with Gradle? If not, where can I find Robolectric's jar files with/without dependencies?
It is not necessary to use Gradle with Robolectric. It is just a matter of running the specified Java class (from JUnit) with the proper classpath (including your source code, test code, and dependencies). Fixing your case is not something that is easy to do over Stack Overflow (it would be a challenge even sitting at the same computer).
Here are possible solutions:
Migrate your project build to Gradle
Keep using Ant, but move dependency management from manual jars to Ivy
Keep using Ant and manual jar dependencies, but try to get a robolectric.jar with all dependencies except the Ant one
The first option is the easiest, in my view. It requires a bit of a change in mindset, but Gradle is officially the only build tool supported by Google, and there are a lot of examples and people that could help.
The second option also requires you to learn a new tool, and there are fewer examples of Ivy usage, especially in Android projects (a sketch of the Ivy approach follows below).
The third option requires writing a custom script that removes Ant from the jar file, or rebuilding robolectric-all.jar without the Ant dependency. This means diving into the Maven build tool.
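If you go the Ivy route, here is a minimal sketch of an ivy.xml dependency that pulls Robolectric while excluding the conflicting Ant artifact (the test->default configuration mapping is an assumption; adjust it to your own configuration names):

<!-- ivy.xml: pull Robolectric for tests, excluding the bundled Ant -->
<dependency org="org.robolectric" name="robolectric" rev="2.3" conf="test->default">
    <exclude org="org.apache.ant"/>
</dependency>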

What is the reason to add Local Maven Repository to sbt?

I keep seeing sbt projects with the following resolvers setting in their build definition:
resolvers += "Local Maven Repository" at "file://"+Path.userHome.absolutePath+"/.m2/repository"
It's also described in the official sbt documentation, in the Resolvers section.
My limited knowledge of sbt and build management tools in general nevertheless leads me to consider it a kind of anti-pattern.
The reason is that if a project declares a dependency on a library in the local Maven repository, that library somehow got downloaded there in the first place, so it's available somewhere outside the local Maven repository. If it is, why not use the original repository as the source?
If the dependency is not in a public repository, and is a dependency of a project, the project can use dependsOn to declare it without the additional repository in resolvers.
Please advise, as I may be missing something obvious that makes the resolvers setting indispensable.
One obvious reason would be if one of your dependencies is a local project built with Maven.
One scenario:
You have a project x which you build with sbt
x depends on y; y is built with Maven.
There is a bug in y which you need to fix/test, and you want to regression test x before you check it in.
You build a snapshot of y, and then you can test x before you commit the change to y.
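Concretely, a minimal sketch of that workflow (the coordinates for y are hypothetical):

# in y's working copy: install the snapshot into the local Maven repository
mvn install

// in x's build.sbt
resolvers += "Local Maven Repository" at "file://" + Path.userHome.absolutePath + "/.m2/repository"
libraryDependencies += "com.example" % "y" % "1.0-SNAPSHOT"

Once the fix to y is published to the shared repository, the extra resolver can be dropped again.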

Handling server JAR in maven

There are some server jars in my project which I want to migrate to Maven.
I don't have any idea how to attach dependencies to these jars; there are almost 24 of them. How can I add them to the project scope?
The approach you can take depends on whether you have access to the sources of those 'server' jars or not. If you do, then nothing prevents you from creating one or more Maven projects, packaging these jars, and deploying them to your Maven repository.
If you don't have access to the sources and these jars aren't already available in official Maven repositories, then all you can do is put them in your Maven repository using the Maven install plugin:
Often times you will have 3rd party JARs that you need to put in your local repository for use in your builds. The JARs must be placed in the local repository in the correct place in order for them to be correctly picked up by Maven. To make this easier, and less error prone, we have provided a goal in the install plug-in which should make this relatively painless.
mvn install:install-file -Dfile=<path-to-file> -DgroupId=<group-id> \
-DartifactId=<artifact-id> -Dversion=<version> -Dpackaging=<packaging>
Once done for all of these jars, just add dependencies to your project.
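For example, a sketch for one such jar (the file name and coordinates are hypothetical):

mvn install:install-file -Dfile=server-core.jar -DgroupId=com.example.server \
    -DartifactId=server-core -Dversion=1.0 -Dpackaging=jar

and the matching entry in your POM:

<dependency>
    <groupId>com.example.server</groupId>
    <artifactId>server-core</artifactId>
    <version>1.0</version>
</dependency>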
I don't recommend adding the server jars to your POM; instead, just use the API jar:
<dependency>
    <groupId>javax</groupId>
    <artifactId>javaee-api</artifactId>
    <version>6.0</version>
    <scope>provided</scope>
</dependency>
The advantage is that you are conforming to a portable standard. M2E will compile things correctly, and your application will still run correctly when deployed to the runtime, provided it supports the API.
If you want to see it explicitly, you can add the runtime by going to the project preferences and then to Targeted Runtimes. You only need to do this on the EAR; it will handle the projects included in the EAR for you. The advantage of adding the targeted runtime is that Eclipse may do extra validation specific to your server.
