Is Nexus able to provide artifacts from repositories that are not configured?

I am using Nexus as a proxy on our company's build server. Sometimes developers add new dependencies to their projects without telling me, so the list of proxy repositories is occasionally out of sync with what is actually required. As a result, jobs on our Jenkins build server fail because of missing artifacts. Jenkins is configured to use the Nexus proxy repositories.
Is it possible to tell Nexus to download an artifact from its original repository if it is not found in the proxied ones?

I assume you mean that developers add repository entries to their Maven POM files to pull in further dependencies, and/or modify their settings.xml.
On the other hand, the CI server is configured to get everything from Nexus with a mirrorOf * entry in its settings.xml.
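For reference, a minimal sketch of such a mirror entry on the CI server, assuming a hypothetical Nexus host and a group repository named maven-public:

# Sketch: route all Maven requests on the CI server through Nexus.
# The host nexus.example.com and the repository name are assumptions.
cat > ~/.m2/settings.xml <<'EOF'
<settings>
  <mirrors>
    <mirror>
      <id>nexus</id>
      <mirrorOf>*</mirrorOf>
      <url>https://nexus.example.com/repository/maven-public/</url>
    </mirror>
  </mirrors>
</settings>
EOF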
Nexus does not add proxy repositories automatically in this setup. IMHO you can do two things:
create scripts that do it for you using the Nexus REST API (see the sketch after this list), or
educate your developers to tell you when a proxy repository needs to be added to Nexus.
Potentially you can even use the Maven Enforcer plugin's requireNoRepositories rule to disallow repository entries in POMs, set up an explicit message, and allow developers to create proxy repositories in Nexus instead. Just don't forget to add them to the group repository you are using on the CI server.
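For the scripted option, here is a sketch against the Nexus 3 repositories REST API (the host, credentials, repository name, and remote URL are all assumptions; Nexus 2 has a different API):

# Sketch: create a new Maven proxy repository via the Nexus 3 REST API.
curl -u admin:password -X POST \
  "https://nexus.example.com/service/rest/v1/repositories/maven/proxy" \
  -H "Content-Type: application/json" \
  -d '{
    "name": "spring-milestones-proxy",
    "online": true,
    "storage": {"blobStoreName": "default", "strictContentTypeValidation": true},
    "proxy": {"remoteUrl": "https://repo.spring.io/milestone",
              "contentMaxAge": 1440, "metadataMaxAge": 1440},
    "negativeCache": {"enabled": true, "timeToLive": 1440},
    "httpClient": {"blocking": false, "autoBlock": true},
    "maven": {"versionPolicy": "RELEASE", "layoutPolicy": "STRICT"}
  }'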

Related

Is it possible to sync from our own Artifactory repo to Maven Central?

For some time, we have published all our artifacts to our own repository, which we host ourselves, using JFrog Artifactory.
We have some open source libraries we want to publish to Maven Central, and have come to the point where we can publish every new version to Maven Central as a manual step. Now we want to automate this, and the two options seem to be either integrating it into our CI workflow or syncing it from our repository. Syncing is the easier solution if we can make it work. Sonatype provides straightforward instructions for doing so with the Nexus Repository Manager here: https://central.sonatype.org/publish/large-orgs/sync/
However, we run Artifactory, not Nexus, so the question is: how do we sync from Artifactory to Maven Central? (Or is it even possible? A confirmation that this is not possible would also be very valuable.)
Syncing the artifacts in Artifactory to Maven Central is not possible from the Artifactory side.
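Since syncing is out, the remaining option the question mentions is the CI workflow: publish each release to Maven Central directly from the build. A minimal sketch, assuming a Maven project with a release profile that carries the GPG-signing and Sonatype publishing configuration (the profile name is an assumption):

# Sketch: publish a release from CI instead of syncing from Artifactory.
# The "release" profile holding signing/publishing config is an assumption.
mvn clean deploy -P release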

Practical Use of Artifactory Repositories

In the near future I will start using Artifactory in my project. I have been reading about local and remote repositories and I am a bit confused about their practical use. In general, as far as I understand:
Local repositories are for pushing and pulling artifacts. They have no connection to a remote repository (e.g. the npm registry at https://www.npmjs.com/).
Remote repositories are for pulling and caching artifacts on demand. They work only one way; it is not possible to push artifacts to them.
If I am right up to this point, then practically you only need a remote repository for npm if you do not develop npm modules but only use them to build your application. In contrast, if you need to both pull and push Docker images, you need one local repository for pushing and pulling custom images and one remote repository for pulling official images.
Question #1
I am confused because our Artifactory admin created a local npm repository for our project. When I discussed the topic with him, he told me that I first need to download packages from the internet to my PC and then push them to the Artifactory server. This does not make any sense to me, because I have seen some remote repositories on the same server and all we need is to pull packages from npm. Is there a point that I am missing?
Question #2
Are artifacts in a remote repository's cache kept until they are intentionally deleted? Is there a default retention policy (e.g. delete packages older than 6 months)? I ask because it is important to keep packages until a meteor hits the servers (for the company's archiving policy).
Question #3
We will need to get official Docker images and customize them for CI. It would be a bit hard to maintain one local repo for pulling and pushing custom images and one remote repo for pulling official images. Let's say I need to pull the official latest Ubuntu image, modify it, push it, and finally pull the custom image back. In this case the image would be pulled via the remote repository, pushed to the local repo, and pulled again from the local repo. Is it possible to use virtual repositories to do this seamlessly, as one repo?
Question #1: This does not make any sense to me because I have seen some remote repositories on the same server and all we need is to pull packages from npm. Is there a point that I am missing?
Generally, you would want to use a remote repository for this. You point your client at the remote repository, and JFrog Artifactory grabs packages from the remote site and caches them locally as needed (see the sketch below).
In some very secure environments, corporate policies do not allow even this (the servers may not be connected to the internet at all); instead, third-party libraries are manually downloaded, vetted, and then uploaded to a local repository. I don't think that is your case; your admin may simply not understand the intended usage.
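For illustration, a sketch of pointing the npm client at such a remote repository (the host and repository name are assumptions):

# Sketch: resolve npm packages through an Artifactory remote npm repository.
npm config set registry https://artifactory.example.com/artifactory/api/npm/npm-remote/
npm install lodash   # fetched from npmjs.com on first request, then cached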
Question #2: Are artifacts in a remote repository's cache kept until intentionally deleted? Is there a default retention policy?
They will not be deleted unless you actively configure Artifactory to do so.
For some repository types there are built-in retention mechanisms, such as a maximum number of snapshots or tags, but not for all of them, and even where they exist they must be actively turned on. Different organizations have different policies for how long artifacts must be retained. There are many ways to clean up old artifacts, but ultimately it depends on your own requirements; one approach is sketched below.
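For example, one way to find cleanup candidates is Artifactory Query Language over the REST API; a sketch, where the host, credentials, and the cached-repository name are assumptions:

# Sketch: list artifacts not downloaded in the last 6 months as cleanup candidates.
curl -u admin:password \
  -H "Content-Type: text/plain" \
  -X POST https://artifactory.example.com/artifactory/api/search/aql \
  -d 'items.find({"repo":"npm-remote-cache","stat.downloaded":{"$before":"6mo"}})'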
Question #3: Is it possible to use virtual repositories to do this seamlessly, as one repo?
A virtual repository lets you aggregate your local and remote repositories so they appear as a single source. So you can do something like:
docker pull myarturl/docker/someimage:sometag
... docker build ...
docker push myarturl/docker/someimage:sometag-my-modified-version
docker pull myarturl/docker/someimage:sometag-my-modified-version
It is also security-aware: if a user only has access to the local repositories and not the remote ones, they will only be able to access the local content, even though they are using the virtual repository that contains both.
That said, I don't see why it would be any harder to use the different repositories explicitly:
docker pull myarturl/docker-remote/someimage:sometag
... docker build ...
docker push myarturl/docker-local/someimage:sometag-my-modified-version
docker pull myarturl/docker-local/someimage:sometag-my-modified-version
This also has the added advantage that users can only pull your modified version of the image, not the remote one (though you can also accomplish that by setting up the correct permissions).

Artifactory not syncing org tree with jcenter

I'm setting up a new Artifactory installation for the first time in my life. I downloaded the tar and extracted it OK, and got some firewall rules in place to allow HTTPS to jcenter.bintray.com. After an initial refresh I see loads of artifacts in the com tree that must come from jcenter, so all seems fine. But when I perform simple Maven tasks like mvn help:active-profiles, I only get warnings and errors indicating that none of the relevant artifacts are available from my Artifactory.
I have checked the firewall logs and found no outgoing traffic from my Artifactory server to anything that is not permitted. What have I missed? My Artifactory is OSS version 7.5.7 rev 705070900.
Artifactory remote repositories do not work as a full mirror of the external repository they point at.
Remote Artifactory repositories proxy the external repository, which means you have to actively request artifacts. When an artifact is requested, Artifactory fetches it from the external repository and caches it inside Artifactory. Further requests for a cached artifact are served from Artifactory without the need to go out to the external repository.
The list of artifacts you are seeing are ones available in the external repository. This feature is called remote browsing and is available for some of the package types supported by Artifactory.
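For example, a single request through the remote repository is enough to populate the cache (the host, repository key, and artifact are assumptions):

# Sketch: request an artifact through the remote repo; Artifactory fetches
# it from the upstream on the first request and serves the cache afterwards.
curl -O https://artifactory.example.com/artifactory/jcenter-remote/junit/junit/4.13.2/junit-4.13.2.jar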
I found the issue, sort of. For reasons I now understand, Maven resolves plugins from separately configured plugin repositories. I added the true source for the plugins to my list of plugin repositories, and that solved the issue for me.

Uploading custom jar to cx-server nexus

So, I am trying to set up a CI/CD pipeline with the s4sdk. I successfully completed all the steps described in this blog. Everything seems to be running smoothly, but my build is failing with the following error message:
The following artifacts could not be resolved: com.sap.xs2.security:security-commons:jar:0.28.6, com.sap.xs2.security:java-container-security:jar:0.28.6, com.sap.xs2.security:java-container-security-api:jar:0.28.6, com.sap.security.nw.sso.linuxx86_64.opt:sapjwt.linuxx86_64:jar:1.1.19: Could not find artifact com.sap.xs2.security:security-commons:jar:0.28.6 in s4sdk-mirror (http://s4sdk-nexus:8081/repository/mvn-proxy/)
Now, this error message makes sense to me: I remember downloading these artifacts from the SAP download center, so they are not available on Maven Central.
I think the error can be resolved by manually uploading those artifacts to the Nexus server, but I don't know how. According to the Nexus documentation, there is a web UI reachable at http://<cx-server-ip>:8081, but it is somehow not responding.
I can confirm with docker ps that both the Jenkins and Nexus containers are running and that the Nexus container is listening on TCP 8081. I am also able to reach the Jenkins frontend to configure and run my pipeline.
What am I missing? Is uploading the missing artifacts to Nexus the right approach? Any help is appreciated.
The Nexus container you see acts as a download cache and is by design not accessible from outside, to prevent accidental changes to it. Also, its life cycle is controlled by the cx-server script, so even if you installed packages there manually, they would be gone once you upgrade Jenkins.
I think the best way to handle this is to set up another Nexus instance where you install the required packages (a sketch follows below) and configure the pipeline to use it as described here (mvn_repository_url). This Nexus needs to be configured as a mirror for Maven Central. We don't have specific docs on how to do that, but this post describes a similar setup.
In this setup, you might want to disable the download cache, as it becomes redundant (set cache_enabled to false).
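For the upload itself, a sketch using Maven's deploy:deploy-file goal with one of the artifacts from the error message (the Nexus URL and the my-nexus server id, which needs matching credentials in settings.xml, are assumptions):

# Sketch: push a locally downloaded SAP jar into your own Nexus instance.
mvn deploy:deploy-file \
  -Dfile=security-commons-0.28.6.jar \
  -DgroupId=com.sap.xs2.security \
  -DartifactId=security-commons \
  -Dversion=0.28.6 \
  -Dpackaging=jar \
  -Durl=https://nexus.example.com/repository/maven-releases/ \
  -DrepositoryId=my-nexus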
I hope this helps.
Kind regards
Florian
The sidecar Nexus acts as a read-only cache for Maven and npm artifacts on the host (and agents) where cx-server is running. By default it looks up artifacts from Maven Central and the default npm registry. In the current implementation, the cache is deleted completely after stopping cx-server, leading to a loss of all internal state.
If you want to use custom sources, you can set them in server.cfg via mvn_repository_url and npm_registry_url. This is documented in the operations guide, which you can find here: https://github.com/SAP/cloud-s4-sdk-pipeline/blob/master/doc/operations/operations-guide.md
In your case, you have to specify a Maven repository which includes the dependencies in question; a sketch follows below.
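A sketch of the relevant server.cfg entries, assuming a shell-style key=value format for that file and hypothetical repository URLs:

# Sketch: point the pipeline at repositories that contain the SAP dependencies.
mvn_repository_url="https://nexus.example.com/repository/maven-public/"
npm_registry_url="https://nexus.example.com/repository/npm-public/"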

Upgrading Artifactory setup with Remote Repositories

I have an Artifactory server with a bunch of remote repositories.
We are planning to upgrade from 5.11.0 to 5.11.6 to take advantage of a security patch in that version.
Questions are:
Do all repositories need to be on exactly the same version?
Is there anything else I need to think about when upgrading multiple connected repositories? (There is nothing specific about this in the manual.)
Do I need to do a system-level export just on the primary server, or should I be doing it on all of the remote repository servers?
Lastly, our repositories are huge... a full system export for backup would take too long...
Is it enough to just take the config files/dirs?
Do I get just the config files/dirs by ticking "Exclude Content"?
If you have an Artifactory instance that points to other Artifactory instances via smart remote repositories, you will not have to upgrade all of the instances, as they will be able to communicate with each other even if they are not on the same version. With that said, it is always recommended to run the latest version of Artifactory on all of your instances in order to enjoy the latest features and bug fixes and the best compatibility between instances. You may find further information about the upgrade process in this wiki page.
In addition, it is always recommended to keep backups of your Artifactory instance, especially when attempting an upgrade. You may use the built-in backup mechanism, or you may manually back up your filestore (by default located in $ARTIFACTORY_HOME/data/filestore) and take database snapshots; a sketch follows below.
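For the manual route, a sketch assuming a PostgreSQL-backed installation (the paths, database name, and user are assumptions):

# Sketch: archive the filestore and snapshot the database before upgrading.
tar czf artifactory-filestore-$(date +%F).tar.gz "$ARTIFACTORY_HOME/data/filestore"
pg_dump -U artifactory -Fc artifactory > artifactory-db-$(date +%F).dump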
What do you mean by
"do all repositories need to be on exactly the same version?"
Are you asking about separate Artifactory instances, or Artifactory HA nodes?
Regarding the full system export, see:
https://www.jfrog.com/confluence/display/RTF/Managing+Backups
https://jfrog.com/knowledge-base/how-should-we-backup-our-data-when-we-have-1tb-of-files/
For more info, you might want to contact JFrog's support.
