I use JFrog Xray v1.10.1 with Artifactory v5.2.1 (both PRO versions).
I cannot find in the Xray documentation (or on Google) how Xray automatically re-scans artifacts that have not changed in Artifactory when the vulnerabilities database is updated.
What is the re-scan policy followed by Xray?
Thanks in advance :)
Xray keeps a graph of all the scanned components and the relationships between them, for example whether a certain Java library is part of a WAR file.
When a new vulnerability is added to the database, Xray checks whether the affected component appears in the dependency graph and, if so, how it impacts the rest of the graph. For example, if a Debian package inside a Docker image is found to be affected, Xray will also mark the Docker image as impacted. This is called impact analysis in Xray terminology.
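To illustrate the idea, here is a purely conceptual sketch of that kind of graph walk (this is not Xray's actual code; the component names and the `contained_in`/`impacted_by` helpers are made up for illustration):

```python
from collections import defaultdict, deque

# Conceptual sketch of impact analysis, not Xray's actual implementation.
# "contained_in" edges point from a component to the artifacts that contain it.
contained_in = defaultdict(set)
contained_in["libssl-1.0.2.deb"].add("myapp-docker-image")   # Debian package inside an image
contained_in["commons-lang-2.6.jar"].add("myapp.war")        # Java library inside a WAR
contained_in["myapp.war"].add("myapp-docker-image")          # WAR packaged into the image

def impacted_by(vulnerable_component):
    """Walk the containment graph upward to collect every impacted artifact."""
    impacted, queue = set(), deque([vulnerable_component])
    while queue:
        for parent in contained_in[queue.popleft()]:
            if parent not in impacted:
                impacted.add(parent)
                queue.append(parent)
    return impacted

# A new CVE on the Java library also flags the WAR and the Docker image:
print(impacted_by("commons-lang-2.6.jar"))  # {'myapp.war', 'myapp-docker-image'}
```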
This is explained in the documentation, in the Watches section.
Related
I am looking for a REST API or AQL query that can be used to report the number of artifacts blocked for download as part of Xray watch violations in JFrog.
As per this publicly available page, there is currently no REST API that can reveal which artifacts are not downloadable because of vulnerabilities.
When you try to download an artifact that has a vulnerability, Xray can indicate that the artifact is not downloadable because of the vulnerabilities in it.
For some time, we have published all our artifacts to our own repository, which we host ourselves, using JFrog Artifactory.
We have some open source libraries we want to publish to Maven Central, and have come to the point where we can publish every new version to Maven Central as a manual step. Now we want to automate this, and the two options seem to be either to integrate it into our CI workflow or to sync it from our repository. Syncing is the easier solution if we can make it work. Sonatype provides some straightforward instructions for doing so with the Nexus Repository Manager here: https://central.sonatype.org/publish/large-orgs/sync/
However, those instructions apply to Nexus, not Artifactory, so the question is: how do we sync from Artifactory to Maven Central? (Or is it even possible? A confirmation that this is not possible would also be very valuable.)
The use case is to sync artifacts from Artifactory to Maven Central, and this is not possible from the Artifactory side.
My build pipeline uploads a multi-module Maven project to an Artifactory repository repo1. At the very end of the pipeline (and only under certain conditions), there is a step that copies a single artifact from repo1 to repo2. I'm using the REST API: $(ARTIFACTORY_URL)/api/copy/repo1/org/sonarsource/sonarlint/eclipse/org.sonarlint.eclipse.site/{version}/org.sonarlint.eclipse.site-{version}.zip?to=/repo2/org.sonarlint.eclipse.site-latest.zip&suppressLayouts=1
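For reference, the same call sketched as a small script (a minimal sketch, not the pipeline's actual code; the base URL, API key, and version value are placeholders, and the copy endpoint is invoked with POST as the Artifactory REST API expects):

```python
import requests

ARTIFACTORY_URL = "https://artifactory.example.com/artifactory"  # placeholder
API_KEY = "<api-key>"                                            # placeholder
version = "1.2.3"                                                # placeholder

src = (f"repo1/org/sonarsource/sonarlint/eclipse/"
       f"org.sonarlint.eclipse.site/{version}/org.sonarlint.eclipse.site-{version}.zip")

# The copy API is a POST to /api/copy/{srcRepo}/{srcPath}?to=/{targetRepo}/{targetPath}
resp = requests.post(
    f"{ARTIFACTORY_URL}/api/copy/{src}",
    params={"to": "/repo2/org.sonarlint.eclipse.site-latest.zip", "suppressLayouts": "1"},
    headers={"X-JFrog-Art-Api": API_KEY},
)
resp.raise_for_status()
print(resp.json())  # per-message results of the copy operation
```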
This works fine, but then I noticed, when looking at the build's published modules in the build details, that the repo path shown next to the artifact now points to repo2/org.sonarlint.eclipse.site-latest.zip. When browsing the repo1 tree, I can see that org.sonarlint.eclipse.site-{version}.zip is still there.
Then I tried to promote the build, but this fails. The zip is not promoted to the final repository. I found this in the logs:
2020-06-12 07:56:10,266 [http-nio-8082-exec-3579] [WARN ] (o.a.a.l.t.PathTranslationHelper:68) - Unable to translate path 'org.sonarlint.eclipse.site-latest.zip': does not represent a valid module path within the source.
2020-06-12 07:56:10,895 [http-nio-8082-exec-3579] [INFO ] (o.a.b.BuildPromotionHelper:214) - Skipping promotion status update: item promotion was completed with errors and warnings.
Question: shouldn't the copy operation preserve the repo path of the copied artifact? Why does it point to the new copy? Or at least, shouldn't the promote operation be smart enough to deal with this?
With the help of JFrog support, we managed to get rid of this issue. When using the promote REST API, we had to specify the sourceRepo parameter.
Long story:
As you can see based on this KB article, once a build promotion is triggered, Artifactory will first search for the list of artifacts it should promote. It searches based on the SHA1 values from the build-info JSON and on the build.name and build.number properties.
Since copied artifacts have the same SHA1 (a copy is essentially a new reference to the same binary), the promotion will arbitrarily take the artifact from either of the two repositories where the copy exists. Specifying the sourceRepo parameter in the build promotion step forces it to consider only that repository, and in my case that fixed the issue.
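As a sketch, the promotion call with sourceRepo looks something like this (build name, build number, repository names, and credentials below are placeholders; sourceRepo and targetRepo are fields of the promotion request body):

```python
import requests

ARTIFACTORY_URL = "https://artifactory.example.com/artifactory"  # placeholder
API_KEY = "<api-key>"                                            # placeholder

payload = {
    "status": "released",
    "sourceRepo": "repo1",     # restrict the artifact lookup to repo1
    "targetRepo": "releases",  # placeholder: the final promotion target
    "copy": True,              # copy rather than move the artifacts
}

# POST /api/build/promote/{buildName}/{buildNumber}
resp = requests.post(
    f"{ARTIFACTORY_URL}/api/build/promote/my-build/42",  # placeholders
    json=payload,
    headers={"X-JFrog-Art-Api": API_KEY},
)
resp.raise_for_status()
```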
Is there a script or any other automated process for migrating artifacts into JFrog Artifactory? We are currently working on this and need more information to carry out the process. Please help us achieve this. Thanks in advance.
If you have an existing artifact repository, JFrog Artifactory supports acting as an artifact proxy while you are in the process of migrating to Artifactory.
I would recommend the following:
Create a local repository in Artifactory.
Create a remote repository in Artifactory which points to your current artifact repository.
Create a virtual repository in Artifactory which contains both the local and remote repositories.
Iterate on all your projects to have them publish to the local Artifactory repository and pull from the virtual repository (see the sketch after this list).
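As a rough sketch, the three repositories could be created through the repository configuration REST API (all repository names, the upstream URL, the credentials, and the Maven package type below are placeholder assumptions, and creating repositories over REST assumes a license tier that permits it):

```python
import requests

ARTIFACTORY_URL = "https://artifactory.example.com/artifactory"  # placeholder
AUTH = ("admin", "<password>")                                   # placeholder

# Local repository that your builds will publish to
requests.put(
    f"{ARTIFACTORY_URL}/api/repositories/libs-local",
    json={"key": "libs-local", "rclass": "local", "packageType": "maven"},
    auth=AUTH,
).raise_for_status()

# Remote repository proxying the legacy artifact repository
requests.put(
    f"{ARTIFACTORY_URL}/api/repositories/legacy-remote",
    json={"key": "legacy-remote", "rclass": "remote", "packageType": "maven",
          "url": "https://legacy-repo.example.com/repo"},  # placeholder upstream
    auth=AUTH,
).raise_for_status()

# Virtual repository aggregating both, for projects to resolve from
requests.put(
    f"{ARTIFACTORY_URL}/api/repositories/libs-virtual",
    json={"key": "libs-virtual", "rclass": "virtual", "packageType": "maven",
          "repositories": ["libs-local", "legacy-remote"]},
    auth=AUTH,
).raise_for_status()
```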
The advantage of this workflow is that you can port things over piece by piece, rather than trying to do it all at once. If you point a dependency at Artifactory that hasn't been ported there yet, Artifactory will proxy it for you. When the dependency is ported over, the change will be transparent to its users.
When you have moved everything to your local Artifactory repository, then you can remove the remote repository from your virtual repository.
The relevant documentation is available here: https://www.jfrog.com/confluence/display/RTF/Configuring+Repositories
For an Enterprise account, I'd assume S3 storage and a significant number of artifacts, so there will be no easy, automated way to do it. It is also highly dependent on the storage implementation chosen in the on-prem solution. If you plan to use S3 storage, JFrog can help perform S3 replication. In other scenarios the solution will be different; I suggest contacting support.
I'm wondering how other Artifactory admins handle this, so here's my question:
We're starting to use Artifactory to manage our artifacts, internal as well as external. The external artifacts are all available in an internal repository. This is because of a conversion from a file-based repository to Artifactory.
Now this is starting to cause issues, and I'm wondering how others manage their external dependencies. As an Artifactory administrator I want to be sure that my developers only use artifacts which have the correct license, so I don't want a "feel free to download everything from the internet" culture.
I want to provide some sort of "whitelisted and approved" set of external artifacts.
Is this possible using Artifactory OSS, or do we have to manually download the artifacts from a remote repository and deploy them to our local repository?
Thank you in advance!
This can be done by writing a user plugin, but it requires the PRO version of Artifactory. You can see here examples of a governance control plugin that was written in the past.
With the OSS version you can't reject user downloads based on license.
Hope that answers your question.