Artifactory replication by default: set up once, copy to others

I'm currently setting up my main instances of JFrog Artifactory Enterprise Edition, and I'd like to be able to define repositories once and have them configured correctly by default on the second master instance for event-based replication, and on the edge nodes for pull replication.
Is this possible?

If I understand correctly, you would like to set up some kind of DR (disaster recovery) where you replicate data from master1 to master2, and to do so you need an exact replica of master1's repository configuration on master2. For now, you can set up all repositories on master1, take the artifactory.config.latest.xml file from the <JFrog_Home>/artifactory/var/etc/artifactory folder on master1, copy it to the same <JFrog_Home>/artifactory/var/etc/artifactory folder on master2, rename it from artifactory.config.latest.xml to artifactory.config.import.xml, and restart master2.
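As a minimal sketch, assuming JFROG_HOME is /opt/jfrog on both machines, that master2 is reachable over SSH, and that Artifactory runs as a systemd service named artifactory (all of these are placeholders to adapt to your installation), the copy-and-rename step could look like this:

# On master1: copy the current configuration descriptor to master2,
# renaming it to artifactory.config.import.xml on the way
scp /opt/jfrog/artifactory/var/etc/artifactory/artifactory.config.latest.xml \
  admin@master2:/opt/jfrog/artifactory/var/etc/artifactory/artifactory.config.import.xml

# On master2: restart Artifactory so it imports the configuration on startup
sudo systemctl restart artifactory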
I would also recommend referring to the articles below:
KB article
Wiki on configuration files

Artifactory migration from 4.0.2 to 7

I migrated from Artifactory 4.0.2 to the latest 7.x using method 1 suggested by https://jfrog.com/knowledge-base/what-is-the-best-way-to-migrate-a-large-artifactory-instance-with-minimal-downtime/.
Browsing the new instance, I see all the definitions of my repositories as well as all the metadata, but I don't see any artifact content (e.g. jar files).
I copied all the filestore content from the old instance to the new one.
What did I miss?
Thanks
From the scenario you've described, it appears to be an issue with the storage (filestore) configuration/connection.
What do we need to check next?
If you changed the filestore path as part of the migration, update that path in the storage configuration file [binarystore.xml] of the Artifactory instance. For example, if you moved the data from one mount to another for the new Artifactory 7.x installation, this step tells Artifactory where exactly to look for the binaries.
The file is located under the $JFROG_HOME/etc/artifactory/ directory in the Artifactory 7.x version.
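As an illustration only, assuming the plain file-system chain template is in use (the ARTIFACTORY_ETC variable and the /mnt/new-filestore path are placeholders, not values from the question), a binarystore.xml pointing at the relocated filestore might be written like this:

# ARTIFACTORY_ETC is a placeholder for the etc/artifactory directory mentioned above
ARTIFACTORY_ETC="$JFROG_HOME/etc/artifactory"

cat > "$ARTIFACTORY_ETC/binarystore.xml" <<'EOF'
<config version="2">
    <chain template="file-system"/>
    <provider id="file-system" type="file-system">
        <!-- point this at the mount where the filestore data now lives -->
        <fileStoreDir>/mnt/new-filestore</fileStoreDir>
    </provider>
</config>
EOF

# Restart Artifactory so the new storage configuration is picked up
sudo systemctl restart artifactory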

Connect one Artifactory to another Artifactory

Our setup includes a company wide Artifactory that holds in-house-built artifacts as well as goes out and fetches publicly available artifacts. I’m trying to setup a local Artifactory at our location that would fetch publicly available artifacts through the regular internet, but would connect to the company wide Artifactory for our in-house-built artifacts. Is this possible?
In my local Artifactory setup, I put the company-wide Artifactory URL as a Remote Repository. I can hit the Test button and it tells me that it successfully connected. However, when I go to download an artifact it does not work. I should mention that publicly available artifacts can be fetched through my local Artifactory, so at least I can get to jcenter.bintray.
Can one Artifactory be connected to another Artifactory? If yes, is there a way to test whether this connection works?
I don't think we would be using all the contents of the company-wide Artifactory, so I don't want to do an export and import into the local instance, or set up replication. I would prefer if we could fetch on demand. Is this possible?
Edit: Thanks to #DarthFennec for pointing me to Smart Remote Repositories, I have solved my problem. For others who have the same problem:
Please follow the steps mentioned on the previously mentioned page to set up the Smart Remote Repository. In my case Artifactory did not detect that the remote was another instance of Artifactory and did not give me any options to set, but I was not interested in these anyway.
Note: You can always click the Test button to make sure that your connection to the Remote Repository works.
Next, go to Admin -> Virtual Repositories, select your Repository Key, and move your Smart Repository from the Available Repositories into the Selected Repositories. Click Save & Finish at the bottom and you should be good to go.
I'm not sure exactly what your problem ended up being, but if you want to remote one Artifactory repository from another, it should be a smart remote repository. This is when Artifactory detects that a remote is pointing at another Artifactory, and it enables a number of extra features, like download statistics, property replication, and remote browsing.
An important thing to keep in mind when configuring a smart remote repository is that depending on the package type, you might need to point the remote at <artifactory>/api/<type>/<repo>, rather than just <artifactory>/<repo>. This is the case for Bower, Chef, CocoaPods, Docker, Go, NuGet, Npm, Php Composer, Puppet, Pypi, RubyGems, and Vagrant repositories. Other repository types should use the standard <artifactory>/<repo> URL.
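To make the URL distinction concrete, here is a hypothetical example; artifactory.mycompany.com, the repository names, and the credentials are placeholders rather than values from the question:

# Smart remote URL for an npm repository on the company-wide Artifactory (needs the /api/npm/ prefix)
#   https://artifactory.mycompany.com/artifactory/api/npm/npm-local
# Smart remote URL for a Maven repository (plain repository path, no API prefix)
#   https://artifactory.mycompany.com/artifactory/libs-release-local

# Quick resolution test through the local instance's smart remote repository
curl -u myuser:mypassword -O \
  "http://localhost:8081/artifactory/company-remote/com/example/app/1.0/app-1.0.jar"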

Running access for Artifactory

New to Artifactory so please bear with me.
I'm trying (and failing) to create a new access token.
The GUI in Artifactory has nothing for this but points to a user guide (https://www.jfrog.com/confluence/display/RTF/Access+Tokens) which talks about managing access tokens through a WAR file.
Here is the blurb:
Access Service
From Artifactory version 5.4, access tokens are managed under a new service
called Access which is implemented in a separate WAR file, access.war. This
change has no impact on how access tokens are used, however, the Artifactory
installation file structure now also includes the added WAR file under the
$ARTIFACTORY_HOME/webapps folder. Artifactory communicates with the Access
service over HTTP and assumes it is running in the same Tomcat using the
context path of "access".
OK, great. So how do I access this thing?
I also don't know much about web apps/servers. Prior to today, I thought WAR was a fight between nations :-)
My Artifactory server proc is running, and I can confirm that the access war file (apparently a jar file of sorts) is in the webapps dir.
I am able to reach Artifactory via "http://myserver:8081/artifactory/webapp/#/home".
As it turns out, I believe the interface to manage access tokens is not provided through a GUI. Rather, you have to use the REST API, for example with curl.
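For example, a token-creation request might look like the following; myserver, the admin credentials, and the username/scope/expiry values are placeholders to adapt to your setup:

# Create an access token for user "jenkins", valid for one hour (3600 seconds)
curl -u admin:password -X POST "http://myserver:8081/artifactory/api/security/token" \
  -d "username=jenkins" \
  -d "scope=member-of-groups:readers" \
  -d "expires_in=3600"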
The documentation mentions:
It is up to the Artifactory administrator to make sure that all participating instances are equipped with the same key pair.
That means you need to have access to the server (where Artifactory is installed).
On that server, the folder where Artifactory is installed is referred to as ARTIFACTORY_HOME.
That is what is used in the next extract from the documentation:
Start up the first Artifactory instance (or cluster node for an HA installation) that will be in your circle of trust. A private key and root certificate are generated and stored under $ARTIFACTORY_HOME/access/etc/keys.
Copy the private key and root certificate files to a location on your file system that is accessible by all other instances/nodes that are in your circle of trust.
Before bootstrapping, for each of the other instances/nodes, create the $ARTIFACTORY_HOME/access/etc folder and create a properties file in it called access.bootstrap.config with the following contents:
key=/path/to/private.key
crt=/path/to/root.crt
When each instance/node starts up, if the $ARTIFACTORY_HOME/access/etc/access.bootstrap.config file exists, then the private key and root certificate are copied from the specified location into the server's home directory under $ARTIFACTORY_HOME/access/etc/keys.
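As a short sketch of those steps on a secondary instance/node (the /shared/keys location is a placeholder for wherever you copied the key material):

# Before first startup of the secondary instance/node
mkdir -p "$ARTIFACTORY_HOME/access/etc"

# Point the Access service at the key pair copied from the first instance
cat > "$ARTIFACTORY_HOME/access/etc/access.bootstrap.config" <<'EOF'
key=/shared/keys/private.key
crt=/shared/keys/root.crt
EOF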

Why does artifactory create a logical -cache repository for each remote repository?

I'm using artifactory (OSS 5.1.3) as a general build dependency cache. I've noticed that in the repository browser, for each remote repository there is a second entry with -cache appended. ex: "jcenter" and "jcenter-cache".
The -cache entries are created automatically. After I added a generic "gradle-distributions" repository to cache https://services.gradle.org/distributions/, I found that I had a "gradle-distributions-cache" repository in the tree as well. The -cache has a different icon, but it's not listed under any of the different repository types in the admin area, and it's not selectable as a source when defining a virtual repository.
Once I've downloaded an artifact once, I can access it through either the main repository name or the -cache name. But if I haven't downloaded something yet, then the -cache name will 404 (while the main name will go out and fetch it).
I couldn't find anything in the settings or documentation to explain the -cache repository. It's useful as a way of seeing what artifactory has already downloaded from the remote, but is there another explanation for it that I'm not apprehending? Is there a reason to point to one name or another in direct urls? (ex: gradle wrapper --gradle-version 3.4.1 --gradle-distribution-url http://localhost:8081/artifactory/gradle-distributions/gradle-3.4.1-bin.zip) This is mainly a curiosity question.
The "-cahce" repositories are mentioned in the remote repositories configuration section.
The idea is that in some cases it is useful to directly access artifacts that are already stored in the cache (for example to avoid remote update checks).
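Using the repository and artifact names from the question, the difference in behaviour can be seen directly with curl:

# Goes through the remote repository: fetches from services.gradle.org if not already cached
curl -I "http://localhost:8081/artifactory/gradle-distributions/gradle-3.4.1-bin.zip"

# Goes to the cache only: returns 404 unless the artifact has already been downloaded once
curl -I "http://localhost:8081/artifactory/gradle-distributions-cache/gradle-3.4.1-bin.zip"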

R Studio - Cloning local repository

I want to create a master repository on our server, from which I can clone a local version onto my computer.
I am using R Studio v0.98.994.
So far, this is what I have tried doing:
Create a folder for the master repository to live in. I do this using 'new project' in R studio, and tell it to make a git repository.
I can then open up another new project, located on my C drive, and use R studio to clone, by telling it to open an existing project and setting the URL as the location of the master project.
However, then when I make changes and commit to my local repository (which works fine) I cannot push to the master repository, I get an error exactly as described in this question: git push fails: `refusing to update checked out branch: refs/heads/master`
So it appears that R Studio creates non-bare repositories?
Now I thought, well okay, I will use git bash to initialise the repository and then connect to that within R studio.
I do so, but cannot then find a way to use that repository in R Studio.
I am very new to Git, so it is entirely probable that this is one of those 'read the instructions' questions, in which case I am very sorry - and could someone possibly point me towards some guidance for this situation? I have spent the better half of a day googling around this error and haven't yet managed to pull together the pieces :( I also apologise; this doesn't feel like a very reproducible question.
It sounds like you are using Windows Git, with a setup on a local Windows machine (C: drive) and a server of some kind, mounted as the S: drive. There are a few things you should be aware of when doing this.
Shared Repositories
If you are intending for multiple people to share the same repository, you want to initialize a shared repository. See the --shared option in git-init for more details. Note that I'm not sure how having your repository on a Windows machine affects the sharing options. If you are just trying to keep your repository in two places, that makes things a lot easier.
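As a minimal sketch (the /s/git/myproject.git path is a placeholder), such a repository would be initialized from git-bash with the --shared flag, combined here with --bare as discussed in the next section:

# Create a bare, group-shared repository on the server drive (placeholder path)
git init --bare --shared=group /s/git/myproject.git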
Bare Repositories
Separate from the discussion of sharing is the discussion of bare repositories. If you don't intend to ever work with files in the server (i.e. it's just going to be a place to push changes so they are safely stored), you could initialize a bare repository. A bare repository contains the database structure of Git, but does not have the actual files in the directory.
A standard Git repository is a directory with a hidden folder in it named .git. This .git folder contains all the various data structures that Git uses to track changes. A bare repository is essentially a folder containing only the contents of .git.
The good thing about a bare repository is that no one can work in the repository itself (since there is no working directory, just the database). This means that no one could log into S: and edit the repository themselves. Instead, they would have to clone the repository, then push their changes back to the origin. The GitGuys have a good article about why this is ideal.
Note that shared repos and bare repos are not dependent or mutually exclusive. As a general practice, if you are having a "server repo" from which you pull and to which you push, you should have it be bare, regardless of whether the project is shared.
A Non-Shared Workflow
Since it's not clear whether you are sharing or not, and you're on a Windows environment (which I don't know much about from a sharing standpoint), I'm going to give you a simple example. Using git-bash, you should be able to change directories to wherever on S: you have your repositories. Then, use git init with the --bare option as described by the link above to initialize a bare repository. Navigate to where you want your repository to live on C:, and then do git clone to get a working copy.
Add a README file or something else so you can do your initial commit, and then commit and do git push origin master to push your changes to the S: repository. Once all that is done, THEN initialize the RStudio Git project. RStudio should defer to your existing configuration, and things should hopefully work.
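Putting those steps together, a rough git-bash sketch might look like this; the /s/repos and /c/projects paths and the myproject name are placeholders for your own locations:

# On the server drive: create the bare "master" repository
cd /s/repos
git init --bare myproject.git

# On the local machine: clone a working copy
cd /c/projects
git clone /s/repos/myproject.git
cd myproject

# Make an initial commit and push it back to the bare repository
echo "My project" > README.md
git add README.md
git commit -m "Initial commit"
git push origin master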
