I have several Scala modules that I'm building with SBT. Some of them (I'll call them dependent modules) are published to Artifactory and then used by top-level modules.
All changes to the code are done in separate git branches. When the feature (or bugfix) is done, that branch is compiled in Jenkins and then deployed to the testing instance and handed off to the QA team.
So it's possible that there will be several git branches with different code in the dependent modules.
The problem is that Ivy caches these modules locally, so it's possible that the top-level module will be built with the dependent module from a different branch (taken from the local cache).
I've tried adding the changing() directive to the dependency specification in build.sbt.
In that case Ivy ignores the local cache and goes to Artifactory to download the POM file every time. It then parses the POM, but concludes that it already has the jar for that version in the local cache, so it fetches the jar from the local cache and not from Artifactory, which isn't what I want.
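For reference, this is roughly how the dependency is declared in build.sbt (the organization and module name below are placeholders, not my real coordinates):

// build.sbt in the top-level module
libraryDependencies += ("com.example" %% "dep-module" % "1.1.1").changing()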
Since the code in branches hasn't been integrated into the master branch at this point, it's perfectly valid that different feature branches have the same version number, but different code.
Is there a way to tell Ivy (via SBT) to ignore the local cache for a certain groupId? Or at least for a single dependency?
If you are using versioning for your dependent modules then each codebase change must produce a different version. Ivy and Maven expect that once an artifact has been published with a specific version it will stay unchanged forever; that is why they use cached files. If you want to download a fresh version from the repository on every compile, you should add the -SNAPSHOT suffix to the dependent module's version number (e.g. dep-module-1.1.1-SNAPSHOT).
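A minimal sketch of what that looks like (the module coordinates are placeholders):

// build.sbt of the dependent module being published
version := "1.1.1-SNAPSHOT"

// build.sbt of the top-level module that consumes it
libraryDependencies += "com.example" %% "dep-module" % "1.1.1-SNAPSHOT"

Ivy treats -SNAPSHOT versions as changing, so on each update it re-checks Artifactory for both the POM and the jar instead of reusing the locally cached jar indefinitely.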
What happens when two different Julia Project.toml files have the same project name and the same depot path? Will instantiating one cause the other's cache to go stale?
I assume by cache you mean the set of packages stored in a depot.
Pkg.instantiate() will ensure that all package versions which exist in the active dependency graph (as specified by the manifest file) exist somewhere in the depot path. In general, Pkg decouples the set of dependencies required by any given project from the set of packages stored in depots. This is why Julia's projects are so light: different projects are free to share dependencies so that there is no unnecessary duplication.
The fact that two different projects have the same name really has no bearing on this process.
Note: although a given project can only have a single version of a dependency, a depot is free to store any number of versions of the same package.
In case you are referring to the precompile cache: there was an issue with multiple versions of the same package clobbering each other. The fix should be in Julia 1.3.
We use gradle for downloading dependencies (from our local Artifactory instance) and publishing artifacts (to our local Artifactory instance), but need to use a variety of different (non-gradle) tools for doing the build itself.
We therefore have a set of dependencies of various types (test dependencies, compile dependencies, etc) expressed in our gradle file, and our build starts out by running gradle to fetch some of those dependencies.
We then transition to other tools for doing the true build, and after our build completes we run the Gradle artifactoryPublish task to publish the release artifacts back to Artifactory.
The Artifactory build-info construct appears to be a great match for tracking our dependencies and release artifacts.
However... the build-info construct published to Artifactory by the Gradle artifactoryPublish task does not include any of our dependency information. Our dependency downloads are done in a separate, unrelated task, and the artifactoryPublish task does not know they are related. Additionally, there are other (test, etc.) dependencies expressed in Gradle which I would like to have listed in the build-info (so they can be promoted) but which are not needed/downloaded as part of the build process.
I am looking for a way to get ALL the expressed gradle dependencies (some of them optional, and not downloaded before the build) expressed in the Artifactory build-info construct.
I noticed that the jfrog CLI tool has support for recording dependencies that were downloaded via out-of-band mechanisms -- via its build-add-dependencies action. I believe I could explicitly download every dependency (including test dependencies and other optional dependencies not needed by the build), then use this action to get those references added to the build-info construct. But... this seems inefficient, since I don't actually need all these dependencies for the build; I just want the build-info to reference them.
I also noticed that in the Artifactory REST API there is an option for manually uploading a build-info file. So presumably I could manually create a build-info file referencing dependencies which already exist in Artifactory, without actually downloading them locally. This seems efficient, but... this seems brittle/error-prone since the build-info construct will likely change over time.
Are there any other or better options to consider?
I would like to understand what role the target folder plays in a SOA MDS project.
I am using JDeveloper and the target folder keeps getting populated with 2 .jar files. I am not sure where these jar files are coming from, but they contain old data which should be changed.
Can somebody please help me understand what is behind the making of these files?
The target folder is the default build output directory used by Maven.
If everything is working correctly, the builds should be generated there by Maven using the configuration specified in the pom.xml file. In your case, the Maven build might not have been run recently, which is why you see old content in the jars.
Have a look inside the pom.xml and see what build configuration has been specified there (it is likely to be no different from a SOA composite Maven build file/POM file). If it all builds correctly, you should be able to deploy that jar directly to the MDS runtime (either manually or via Maven).
In the POM file you should be able to override most things, including the name, version, bundle type, target directory, etc.
You can also use Maven to keep track of your MDS changes - i.e. version them like any other build artifact/SOA composite. The versioned jars can also be uploaded to an artifact repository (such as Nexus), in addition to being deployed to the MDS runtime, so you have a good level of traceability of MDS changes.
PS -
This might help explain more: http://weblog.singhpora.com/2016/10/managing-shared-metadata-mds-in-ci.html
I am trying to achieve what a resourceGenerator in Runtime would do: create a resource that is available on the classpath during runtime but is not packaged under the main configuration.
In my specific case, I am trying to create an sbt plugin that facilitates dealing with JNI native libraries. The above mentioned resource would be a "fat" jar containing a shared library, thus it is not required for compilation but only during runtime.
My goal in the end is to publish the standard jar (in the Compile configuration) and publish the fat jar as an extra artifact (in the Runtime configuration). However, during local testing, I would like the shared libraries to be available on the classpath when simply calling run from sbt.
I tried implementing a resourceGenerator in Runtime, but with no success. An alternative approach I could imagine would be to modify runtime:exportedProducts or alter runtime:managedClasspath directly, but I first wanted to know whether there is already a way to include resources only in the Runtime configuration.
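To make the intent concrete, here is a sketch of the classpath idea, using runtime:unmanagedClasspath rather than managedClasspath since it is simpler to append to (the task name, jar path, and artifact name are made up, and the actual jar assembly is elided):

// build.sbt -- sketch only: expose a runtime-only jar on the classpath
val nativeFatJar = taskKey[File]("Builds the fat jar that wraps the shared library")

nativeFatJar := {
  val out = (resourceManaged in Runtime).value / "native-fat.jar"
  // ... assemble the fat jar containing the shared library here ...
  out
}

// visible to `run`, but not packaged into the main (Compile) jar
unmanagedClasspath in Runtime += Attributed.blank(nativeFatJar.value)

// publishing it as an extra artifact alongside the standard jar would then be
// something like: addArtifact(Artifact("my-project", "natives"), nativeFatJar)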
CCNET questions - Here's the scenario:
I've got 10 developers doing local development against a Sitecore installation, with Git as version control. When done with their feature/fix they push to an integration repository.
I've got CCNET set up for the Sitecore project so that it points to the remote integration repository and the local live QA code base. CCNET finds the commits that my developers have made to the integration repository and then updates the QA code base repository.
I also have a couple of other .NET class library projects that are managed by CCNET and compiled with their output pointed to the Sitecore bin directory.
The Sitecore installation is merely the result of a build with no compilable aspects. It's a web product with its own API as well as the ability to integrate custom DLLs that we create to customize the product.
Questions:
Is a CCNET build task required as a condition to execute other activities such as nUnit or robocopy? (The reason I ask is that a "build" is natively used to compile an app and generate output, whereas the only reason we'd want to build is to make sure all dependencies are there so we can jump straight to unit testing...)
If my developers are NOT pointing to a centralized repository like integration, how would CCNET know where all of their remote Git repositories are, when the config document only allows one Git source control section per project?
Per project, when I configure the Git version control specs, it asks for the branch, which is statically saved to the config document. Does CCNET have the ability to accept different branches dynamically?
There's no need to have an "actual build" in your project - it could consist of any type of tasks inside the tasks element. I have a couple of projects which only copy the files from the repository to an FTP server after deleting some files which shouldn't be published.
I have no experience with Git, but you can define multiple source control blocks of any type if you use the multi source control block.
You could use dynamic parameters, which allow the user to set their values when triggering the build.