My remote repos throw no errors, but also have no content - Artifactory

Basically what it says on the tin. I have two instances of Artifactory running and I want to set up a local repo on Artifactory instance A to be a remote repo on Artifactory instance B. I type in the correct path (and every plausible variation of it) and when I click "Test" it says everything is good to go. I set up the correct password authentication, too, and clicking "Test" also says everything is good. I made sure that no repo was blacked out or anything and I checked the system logs to ensure that nothing went wrong silently.
And yet, I can't actually query any data from that repo. Artifactory says the remote repo contains zero artifacts. If I try to download a specific file from that repo, I get a 404.
I tried tweaking the settings more or less at random, which, unsurprisingly, didn't work. But I get no error messages, no warnings, no odd behavior. I don't know what else to try.

So my mistake turned out to be that the repository key it asks for isn't the key of the remote repo on the other instance, but the key of the repository you are creating on your own instance. I suppose this is there to let you give the remote repo a different public-facing name than the repo it actually pulls from. Otherwise you would have to type in myartifactory.com/my-repo-local as the URL and then my-repo-local again as the repository key.
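For reference, here is roughly what the same setup looks like through Artifactory's repository configuration REST API instead of the UI. The hostnames, repository names, and credentials below are placeholders, so treat this as a sketch rather than an exact recipe:

# On instance B, create a remote repository named "my-repo-remote" that proxies
# the local repository "my-repo-local" on instance A. The "key" is the name the
# repository will have on B; the repository on A is identified only by the URL.
curl -u admin:<admin-password> -X PUT \
  "https://artifactory-b.example.com/artifactory/api/repositories/my-repo-remote" \
  -H "Content-Type: application/json" \
  -d '{
        "key": "my-repo-remote",
        "rclass": "remote",
        "packageType": "generic",
        "url": "https://artifactory-a.example.com/artifactory/my-repo-local",
        "username": "reader",
        "password": "<reader-password>"
      }'

The key and the last path segment of the URL only need to match if you want the public-facing name to mirror the upstream name.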

Related

Connect one Artifactory to another Artifactory

Our setup includes a company-wide Artifactory that holds in-house-built artifacts and also goes out and fetches publicly available artifacts. I'm trying to set up a local Artifactory at our location that would fetch publicly available artifacts over the regular internet, but would connect to the company-wide Artifactory for our in-house-built artifacts. Is this possible?
In my local Artifactory setup, I put the company-wide Artifactory URL in as a Remote Repository. I can hit the Test button and it tells me that it connected successfully. However, when I go to download an artifact, it does not work. I should mention that publicly available artifacts can be fetched through my local Artifactory, so at least I can get to jcenter.bintray.
Can one Artifactory be connected to another Artifactory? If yes, is there a way to test whether this connection works?
I don't think we would be using all the contents of the company-wide Artifactory, so I don't want to do an export and import into the local instance, or set up replication. I would prefer to fetch on demand. Is this possible?
Edit: Thanks to @DarthFennec for pointing me to Smart Remote Repositories, I have solved my problem. For others who have the same problem:
Please follow the steps on the page linked above to set up the Smart Remote Repository. In my case, Artifactory did not detect that the remote was another instance of Artifactory and did not offer me any extra options to set, but I was not interested in these anyway.
Note: You can always click the Test button to make sure that your connection to the Remote Repository works.
Next, go to Admin -> Virtual Repositories, select your Repository Key, and move your Smart Repository from Available Repositories into Selected Repositories. Click Save & Finish at the bottom and you should be good to go.
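For completeness, the same virtual-repository change can be made through the REST API. This is only a sketch with placeholder names, and the repositories list replaces whatever was selected before, so include everything you want the virtual repo to aggregate:

# Update the virtual repository "my-virtual" so that it aggregates the smart
# remote "my-repo-remote" alongside an existing local repo (all names are examples).
curl -u admin:<admin-password> -X POST \
  "https://artifactory-b.example.com/artifactory/api/repositories/my-virtual" \
  -H "Content-Type: application/json" \
  -d '{
        "key": "my-virtual",
        "rclass": "virtual",
        "packageType": "generic",
        "repositories": ["libs-release-local", "my-repo-remote"]
      }'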
I'm not sure exactly what your problem ended up being, but if you want to remote one Artifactory repository from another, it should be a smart remote repository. This is when Artifactory detects that a remote is pointing at another Artifactory, and it enables a number of extra features, like download statistics, property replication, and remote browsing.
An important thing to keep in mind when configuring a smart remote repository is that, depending on the package type, you might need to point the remote at <artifactory>/api/<type>/<repo> rather than just <artifactory>/<repo>. This is the case for Bower, Chef, CocoaPods, Docker, Go, NuGet, npm, PHP Composer, Puppet, PyPI, RubyGems, and Vagrant repositories. Other repository types should use the standard <artifactory>/<repo> URL.
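As a made-up example (the hostnames and repo names are mine, not from the question), the remote URL for an npm repository on the other instance would need the API prefix, while a Maven or generic repository would not:

npm repository:    https://artifactory-a.example.com/artifactory/api/npm/npm-local
Maven repository:  https://artifactory-a.example.com/artifactory/libs-release-local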

How do I remove Artifactory from being my default npm repository?

I was working on a client contract, and they used Artifactory internally. Now that I'm on a different project, I'd like to remove it. I can't even remember how we set it up now, and I must be Googling for the wrong keywords because I can't find anything on this. I just want to see my packages coming from npm again, but Artifactory won't let me have what I want. What am I missing here?
Remove your ~/.npmrc file and try again.
You may need to reload the default .npmrc file provided with npm (not sure about that).
If you are curious, you can probably look into your ~/.npmrc file and confirm that it contains a line like
registry=http://<artifactory_server>/artifactory/api/<npm_repo>
This is why any request is sent to the Artifactory server rather than the public npm servers.
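If you'd rather not delete the whole file (it may also hold auth tokens or other settings you want to keep), you can inspect and reset just the registry entry with npm itself; the URL below is simply the public default registry:

npm config get registry                                 # show where npm currently points
npm config delete registry                              # drop the Artifactory override
npm config set registry https://registry.npmjs.org/    # or set the public registry explicitly

Also check for a project-level .npmrc in the repository you are working in, since it takes precedence over the user-level ~/.npmrc.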

Why does Artifactory create a logical -cache repository for each remote repository?

I'm using Artifactory (OSS 5.1.3) as a general build dependency cache. I've noticed that in the repository browser, each remote repository has a second entry with -cache appended, e.g. "jcenter" and "jcenter-cache".
The -cache entries are created automatically. After I added a generic "gradle-distributions" repository to cache https://services.gradle.org/distributions/, I found that I had a "gradle-distributions-cache" repository in the tree as well. The -cache has a different icon, but it's not listed under any of the different repository types in the admin area, and it's not selectable as a source when defining a virtual repository.
Once I've downloaded an artifact once, I can access it through either the main repository name or the -cache name. But if I haven't downloaded something yet, then the -cache name will 404 (while the main name will go out and fetch it).
I couldn't find anything in the settings or documentation to explain the -cache repository. It's useful as a way of seeing what artifactory has already downloaded from the remote, but is there another explanation for it that I'm not apprehending? Is there a reason to point to one name or another in direct urls? (ex: gradle wrapper --gradle-version 3.4.1 --gradle-distribution-url http://localhost:8081/artifactory/gradle-distributions/gradle-3.4.1-bin.zip) This is mainly a curiosity question.
The "-cache" repositories are mentioned in the remote repositories configuration section.
The idea is that in some cases it is useful to directly access artifacts that are already stored in the cache (for example to avoid remote update checks).
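A quick way to see the difference (host, repo, and artifact path here are just examples): both URLs resolve once the artifact has already been cached, but only the first one will reach out to the remote when it has not been:

curl -O http://localhost:8081/artifactory/jcenter/junit/junit/4.12/junit-4.12.jar
curl -O http://localhost:8081/artifactory/jcenter-cache/junit/junit/4.12/junit-4.12.jar

The -cache name returns a 404 in that case rather than triggering a download, which is exactly the behavior described in the question.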

Nexus proxy will not fetch artifact

We have a Nexus OSS instance set up to host one repo and proxy several others, so the Maven settings.xml sets our instance up as a mirror of *. This works for most artifacts, but one repo fails every time.
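That mirror setup corresponds roughly to the following settings.xml fragment (the server name is a placeholder):

<mirrors>
  <mirror>
    <id>nexus</id>
    <mirrorOf>*</mirrorOf>
    <url>http://servername:8081/nexus/content/groups/public/</url>
  </mirror>
</mirrors>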
The failing repo is a snapshot repository, another proprietary one within the company, and I've set it up as a proxy repo (with snapshots allowed), added this proxy to the main Group, and pointed Maven towards http://servername:8081/nexus/content/groups/public/. Maven now fails when it asks for the artifact (and also for the metadata), and indeed browsing to the location it mentions shows that it does not exist. Interestingly, the directory of the SNAPSHOT shows as existing, with only metadata and no artifact or POM, but even the link to maven-metadata.xml fails with a 404.
When I use the group's "Browse Index" tab in the GUI I see the artifact, with a repo path of http://servername:8081/nexus/service/local/repositories/public/content/<groupId/artifactId-with-version> (Not Cached) and this fails too. The remote repository does contain it though!
Actually, going to the proxy in the GUI I can download the artifact from servername:8081/nexus/service/local/repositories/<snapshot-repo>/content/<groupId/artifactId-with-version>. So it feels like maybe a problem with the Group but I can't see any options that I can change to affect this, nor anything in the logs to indicate what's happening.
Although I've seen a couple of similar questions here already, I couldn't see any solution suggested. I'm happy to be proved wrong!
See this article for troubleshooting tips: https://support.sonatype.com/entries/21437881-Troubleshooting-Artifact-Download-Failures
In particular, the ?describe diagnostic URL mentioned at the bottom of the article will help you figure this out.
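In other words, appending ?describe to the failing URL asks Nexus to explain how it tried to resolve the item instead of just returning the 404. Something along these lines (the path is a placeholder, and the exact output varies by Nexus version):

curl "http://servername:8081/nexus/content/groups/public/<groupId/artifactId-with-version>/maven-metadata.xml?describe"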

Local Git Server Setup (using Bonobo)

I am trying to set up Bonobo Git Server as a GitHub-like service on a centralized server. I have followed the instructions given in their documentation. After doing everything as they describe, when I try to access the Bonobo Git Server web page, I get the following error. As I am new to ASP.NET, I am not able to completely understand what the problem is.
Can anyone please guide me as to what is causing this error and how I can solve it?
Also, if you could recommend any other good Git server for local setup that you've tried, it would be highly appreciated.
I found the solution:
IIS - this configuration section cannot be used at this path (configuration locking?)
(The chosen answer helped me).
But unfortunately, now a new error appeared saying
Unrecognized attribute 'targetFramework'. Note that attribute names are case-sensitive.
I searched and found this:
Unrecognized attribute 'targetFramework'. Note that attribute names are case-sensitive
Kenik's answer
Registering the framework with IIS is what worked for me:
C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319>aspnet_regiis -i
helped me solve this problem. And voilà, I've finally set up my local Git server :)
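If you want to double-check which ASP.NET versions IIS knows about before and after running that command, the same tool can list them; if I remember correctly, the -lv switch does this:

C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319>aspnet_regiis -lv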
Update:
I had problems connecting to the server after setting it up. The following sequence worked for me:
Created an empty Git repository at Bonobo Git Server (in my case: localhost://Bonobo.Git.Server.new/). The username and password were admin/admin.
Created the users from the user management control.
Created an empty repository called newproject, added the users to the repository.
At the client side, I used the following command to clone the repository:
git clone http://username@server/Bonobo.Git.Server.new/newproject.git, e.g. in my case it was
git clone http://kamran@Bonobo.Git.Server.new/newproject.git
This cloned the repository on the client side, with a warning that the repository was empty, but that's no problem. I moved into the repository, created some files for testing, and pushed them to the server using git push. Then, to test that everything was working, I viewed the repository history and my commit was shown there :) To be extra sure, I tried the same procedure on another client, i.e. cloned the repository, made some changes, and pushed them to the server. After that, back on my first client, I pulled the repository with git pull and the changes were there :)
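Condensed, the client-side round trip above boils down to something like this (the server, user, repository, and file names are just the examples from my setup):

git clone http://kamran@Bonobo.Git.Server.new/newproject.git
cd newproject
echo "test" > test.txt
git add test.txt
git commit -m "First test commit"
git push origin master
# later, on another clone of the same repository:
git pull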
Note: The server's firewall must be turned off in order for the client to connect to the server.
