Artifactory - Manage external dependencies

I'm wondering how other Artifactory admins handle this, so here's my question:
We're starting to use Artifactory to manage our artifacts, internal as well as external. The external artifacts are currently all available in an internal repository, a leftover from our conversion from a file-based repository to Artifactory.
Now this is starting to cause issues, and I'm wondering how others manage their external dependencies. As an Artifactory administrator I want to be sure that my developers only use artifacts with the correct license, so I don't want a "feel free to download everything from the internet" culture.
I want to provide some sort of "whitelisted and approved" set of external artifacts.
Is this possible using Artifactory OSS, or do we have to manually download the artifacts from a remote repository and deploy them to our local repository?
Thank you in advance!

This can be done by writing a user plugin, but it requires the Pro version of Artifactory. You can see examples here of a governance control plugin that was written in the past.
With the OSS version you can't reject user downloads based on license.
Hope that answers your question.
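If you stay on OSS and go down the manual route the question mentions, the curation step itself is just a vetted download followed by a deploy (PUT) into an "approved" local repository. A minimal sketch in Python, where the upstream URL, repository key, target path, and credentials are all placeholder assumptions:

import requests

# Hypothetical manual curation flow for Artifactory OSS. The upstream URL,
# repository key, target path, and credentials below are placeholders.
UPSTREAM = "https://repo1.maven.org/maven2/com/example/lib/1.2.0/lib-1.2.0.jar"
ARTIFACTORY = "https://artifactory.example.com/artifactory"
APPROVED_PATH = "approved-libs-local/com/example/lib/1.2.0/lib-1.2.0.jar"
AUTH = ("admin", "password")

# 1. Download the artifact whose license you have already reviewed.
artifact = requests.get(UPSTREAM, timeout=120)
artifact.raise_for_status()

# 2. Deploy it into the curated local repository; Artifactory treats a PUT to
#    <base-url>/<repo-key>/<path> as an upload.
deploy = requests.put(f"{ARTIFACTORY}/{APPROVED_PATH}", data=artifact.content, auth=AUTH, timeout=120)
deploy.raise_for_status()

Developers then resolve only from the approved repository, which gives you the whitelist effect even though OSS cannot block downloads by license.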

Related

Can you just store binaries?

We are using Artifactory Enterprise and, in addition to "normal" usage, we would like to just store some binaries in Artifactory. This is so we can limit egress and pull the binaries from Artifactory instead of the general Internet. Is this possible? Is there a documentation link that will help explain the process?
Yes, this can be done by creating a generic local repository and deploying the binaries through the UI or the REST API; you can then consume the binaries from that generic local repository. Refer to this blog as well.
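A minimal sketch of that round trip, assuming a generic repository called generic-local and placeholder host, path, and credentials (the UI "Deploy" dialog performs the same upload for you):

import requests

ARTIFACTORY = "https://artifactory.example.com/artifactory"  # placeholder host
AUTH = ("admin", "password")                                  # placeholder credentials
TARGET = f"{ARTIFACTORY}/generic-local/tools/mytool/2.3.1/mytool-2.3.1-linux-amd64.tar.gz"

# Deploy: a PUT to <repo-key>/<path> uploads the binary.
with open("mytool-2.3.1-linux-amd64.tar.gz", "rb") as f:
    put = requests.put(TARGET, data=f, auth=AUTH, timeout=300)
put.raise_for_status()

# Consume: any client (curl, wget, a CI job) can now GET the same path instead
# of reaching out to the general internet.
get = requests.get(TARGET, auth=AUTH, timeout=300)
get.raise_for_status()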

Upgrading Artifactory setup with Remote Repositories

I have an Artifactory server with a bunch of remote repositories.
We are planning to upgrade from 5.11.0 to 5.11.6 to take advantage of a security patch in that version.
Questions are:
Do all repositories need to be on exactly the same version?
Is there anything else I need to think about when upgrading multiple connected repositories? (There is nothing specific about this in the manual.)
Do I need to do a system-level export just on the primary server, or should I be doing it on all of the remote repository servers?
Lastly, our repositories are huge... a full System Export for backup will take too long...
Is it enough to just take the config files/dirs?
Do I get just the config files/dirs by hitting "Exclude Content"?
If you have an Artifactory instance that points to other Artifactory instances via smart remote repositories, you will not have to upgrade all of the instances; they will be able to communicate with each other even if they are not on the same version. That said, it is always recommended to run the latest version of Artifactory on all of your instances to get the latest features and bug fixes and the best compatibility between instances. You can find further information about the upgrade process in this wiki page.
In addition, it is always recommended to keep backups of your Artifactory instance, especially when attempting an upgrade. You may use the built-in backup mechanism, or you may manually back up your filestore (by default located in $ARTIFACTORY_HOME/data/filestore) and take database snapshots.
What do you mean by
"do all repositories need to be on exactly the same version?"
Are you asking about Artifactory instances? Artifactory HA nodes?
Regarding the full system export:
https://www.jfrog.com/confluence/display/RTF/Managing+Backups
https://jfrog.com/knowledge-base/how-should-we-backup-our-data-when-we-have-1tb-of-files/
For more info, you might want to contact JFrog's support.
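If what you are after is a configuration-only export (the "Exclude Content" option), here is a hedged sketch of triggering it through the Export System REST endpoint; the endpoint and field names are my reading of that API for this version line, and the host, credentials, and export path are placeholders, so verify them against your own documentation:

import requests

ARTIFACTORY = "https://artifactory.example.com/artifactory"  # placeholder host
AUTH = ("admin", "password")                                  # placeholder admin credentials

payload = {
    "exportPath": "/backup/artifactory-config-export",  # path on the Artifactory server
    "includeMetadata": True,
    "createArchive": False,
    "excludeContent": True,   # configuration and metadata only, no binaries
    "verbose": False,
}

resp = requests.post(f"{ARTIFACTORY}/api/export/system", json=payload, auth=AUTH, timeout=600)
resp.raise_for_status()
print(resp.text)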

RPM Remote Repository - Package does not match intended download

We're making use of a remote repository and are storing artifacts locally. However, we are running into a problem because the remote repository regularly rebuilds all artifacts that it hosts. In our current state, the metadata (e.g. repodata/repomd.xml) gets updated, but the artifacts are not.
We have to continually clear out our local remote-repository cache in order to allow it to download the rebuilt artifacts.
Is there any way we can configure Artifactory to let it re-cache new artifacts as well as the new artifact metadata?
In our current state, the error we regularly run into is:
https://artifactory/artifactory/remote-repo/some/path/package.rpm:
[Errno -1] Package does not match intended download.
Suggestion: run yum --enablerepo=artifactory-newrelic_infra-agent clean metadata
Unfortunately, there is no good answer to that. Artifacts under a given version should be immutable; it's dependency management 101.
I'd put as much effort as possible into convincing the team producing the artifacts to stop overriding versions. It's true that it can sometimes be cumbersome to change dependency versions in metadata, but there are ways around it (like resolving the latest patch during development, as supported by the semver spec), and in any case that's not a good excuse.
If that's not possible, I'd look into enabling direct repository-to-client streaming (i.e. disabling artifact caching) to prevent the problem of stale artifacts.
Another option is to clean up the cache, using a user plugin or a script built on JFrog CLI, whenever you learn that newer artifacts have been published in the remote repository.
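For the cache clean-up route, the simplest REST-based variant is to delete the rebuilt package from the remote repository's cache so the next request re-fetches it from upstream. A minimal sketch, where the host, credentials, and cached path are placeholders and the remote cache is addressed with the -cache suffix:

import requests

ARTIFACTORY = "https://artifactory.example.com/artifactory"   # placeholder host
AUTH = ("admin", "password")                                   # placeholder admin credentials
CACHED_ITEM = "remote-repo-cache/some/path/package.rpm"        # placeholder cached path

resp = requests.delete(f"{ARTIFACTORY}/{CACHED_ITEM}", auth=AUTH, timeout=60)
# 204 means the cached copy was removed; 404 means it was not cached to begin with.
if resp.status_code not in (204, 404):
    resp.raise_for_status()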

Migration of binaries to JFrog Artifactory

Is there a script or any other automated process for migrating artifacts into JFrog Artifactory? We are currently working on this and need more information to carry out the process. Please help us achieve this. Thanks in advance.
If you have an existing artifact repository, JFrog Artifactory supports acting as an artifact proxy while you are in the process of migrating to Artifactory.
I would recommend the following:
Create a local repository in Artifactory.
Create a remote repository in Artifactory which points to your current artifact repository.
Create a virtual repository in Artifactory which contains both the local and remote repositories.
Iterate over all your projects to have them publish to the local Artifactory repository and pull from the virtual repository.
The advantage of this workflow is that you can port things over piece by piece, rather than trying to do it all at once. If you point a dependency at Artifactory that hasn't been ported there yet, Artifactory will proxy it for you. When the dependency is ported over, the change is transparent to its users.
When you have moved everything to your local Artifactory repository, you can remove the remote repository from your virtual repository.
The relevant documentation is available here: https://www.jfrog.com/confluence/display/RTF/Configuring+Repositories
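As a rough illustration of the first three steps, here is a hedged sketch using the repository configuration REST API (PUT /api/repositories/<key>); the repository keys, package type, and upstream URL are made-up examples, and the exact configuration fields should be checked against the documentation for your version:

import requests

ARTIFACTORY = "https://artifactory.example.com/artifactory"  # placeholder host
AUTH = ("admin", "password")                                  # placeholder admin credentials

repos = {
    "libs-local":   {"rclass": "local",   "packageType": "maven"},
    "libs-remote":  {"rclass": "remote",  "packageType": "maven",
                     "url": "https://old-repo.example.com/repository/libs"},
    "libs-virtual": {"rclass": "virtual", "packageType": "maven",
                     "repositories": ["libs-local", "libs-remote"],
                     "defaultDeploymentRepo": "libs-local"},
}

# Create each repository; the order matters because the virtual repository
# references the other two.
for key, config in repos.items():
    resp = requests.put(f"{ARTIFACTORY}/api/repositories/{key}", json=config, auth=AUTH, timeout=60)
    resp.raise_for_status()
    print(f"Created {key}")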
For an Enterprise account, I'd assume S3 storage and a significant number of artifacts, so there will be no easy, automated way to do it. It is also highly dependent on the storage implementation chosen for the on-prem solution. If you plan to use S3 storage, JFrog can help perform S3 replication. In other scenarios, the solution will be different. I suggest contacting support.

What is the point of Artifactory or Nexus, and how might I use them?

While investigating CI tools, I've found that many CI installations also integrate with artifact repositories like Sonatype Nexus and JFrog Artifactory.
Those tools sound highly integrated with Maven. We do not use Maven, nor do we even compile Java. We compile C++ using Qt/qmake/make, and this build works really well for us. We are still investigating CI tools.
What is the point of using an Artifact repository?
Is archiving to Nexus or Artifactory (or Archiva) supposed to be a step in our make chain, or part of the CI chain, or could it be either?
How might I make our "make" builds or perl/bash/batch scripts interact with them?
An artifact repository has several purposes. The main purpose is to keep a copy of Maven Central (or any other Maven repository) for faster download times, and so that you can still build with Maven even if the internet is down. Since you're not using Maven, this is irrelevant for you.
The second purpose is to store files that you want to use as dependencies but cannot freely download from the internet. You buy them, or get them from your vendors, and put them in your repository. This is also more applicable to Maven users and their dependency mechanism.
The third important purpose is to have a central place where you can store your releases. If you build a release v1.0, you can upload it to such a repository, and thanks to Maven's clean naming scheme it's easy to know where to find v1.0 and how to use it with other tools. You could, for example, write a script which downloads your release with wget and installs it on a host.
Most of the time these repositories also support a staging process: you store v1.0 in a staging repository, someone runs the tests, and when everything is fine it is promoted to the release repository where everybody can find and use it.
They are simple to integrate with Maven projects, and plenty of other build tools and frameworks can connect to them easily, such as Ant/Ivy, Groovy Grape, and so on. Because of the naming scheme, nothing stops you from using bash or perl to download files from them or upload files to them.
So if you have releases or files which should be shared between projects and you don't have a good solution for it yet, an artifact repository could be a good starting point to see how this could work.
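To make the "plain scripts can talk to it" point concrete, here is a tiny hypothetical example of fetching a release by its predictable path; the server, repository key, and coordinates are invented for illustration, and wget or curl against the same URL works just as well:

import requests

ARTIFACTORY = "https://artifactory.example.com/artifactory"  # placeholder host
# Maven-style layout: <group-as-path>/<artifact>/<version>/<artifact>-<version>.<ext>
URL = f"{ARTIFACTORY}/releases-local/com/example/myapp/1.0/myapp-1.0.tar.gz"

resp = requests.get(URL, timeout=120)
resp.raise_for_status()
with open("myapp-1.0.tar.gz", "wb") as f:
    f.write(resp.content)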
As mentioned here:
Providing stable and reliable access to repositories
Supporting a large number of common binaries across different environments
Security and access control
Tracing any action done to a file back to the user
Transferring a large number of binaries to a remote location
Managing infrastructure configuration across different environments
