Artifactory: Converting remote repo to local repo

My employer has been misusing Bintray as our binary repository for some time. We are finally moving to Artifactory instead and closing down Bintray. But this seems to be an almost impossible task. There is no way of exporting Bintray repos to a zip. Downloading the repos means manually downloading each file from the UI or through their API. I have tried two approaches for automation:
1) Using wget to crawl our Bintray like this:
wget -e robots=off -o ~/wget.log -w 1 -m -np --user=<username> --password=<password> "https://<account>.bintray.com"
which yielded all of the files in the repos. But this only solves half the problem: I couldn't figure out how to import the files into a repository in Artifactory (all the repos are over 100 MB each and therefore can't be uploaded, for some reason).
2) I set the Bintray repos up as remote repositories and enabled active replication. That seems to have worked for now. But I don't know whether they will be removed when the Bintray account is closed, or even whether they are actually stored in Artifactory. Therefore I would like to convert the remote repo to a local repo, to make sure that it is permanently stored in Artifactory. Is there a way of doing this? If so, how?

I'll try to address both of your questions below.
What do you mean you can't upload more than 100 MB? Which version of Artifactory are you using? An on-prem or SaaS-based installation? How are you trying to upload your files to Artifactory? Have you tried importing the content using Artifactory's import feature? (Admin --> Import & Export --> Repository Import)
It sounds like you are using the UI for the upload; if so, you can configure the max upload size on the Admin --> General Configuration page.
If you mean that you have all of the content from Bintray cached in your remote repository cache in Artifactory, just use the "Copy" or "Move" option to move the content to a local repository. This will ensure that all of the content is stored locally.
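The same operation is also available through the REST API; a rough sketch, assuming the remote repository is named bintray-remote (Artifactory exposes its cache as bintray-remote-cache) and the target local repository is named my-local-repo:
curl -u <username>:<password> -X POST "https://<artifactory_host>/artifactory/api/copy/bintray-remote-cache/com?to=/my-local-repo/com"
This copies the com folder from the cache into the local repository; repeat it for each top-level folder in your repo, and append &dry=1 to the URL first if you want to preview what would be copied.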

Incorrect directory structure created using jfrog CLI

I am trying to download an entire folder (as is, with all its files and subfolders) from an Artifactory repo to my local folder.
Note - I am using the Artifactory Pro cloud version; my local repo is a generic repo.
I run the following command in the JFrog CLI (used this article as a reference):
jfrog rt download --include-dirs=true --flat=true --user=XXX --password=XXX --url=https://XXXX.jfrog.io/XXXX --recursive '/support-pack/aem-dispatcher/files/(*)' '{1}'
The files get downloaded, however the result is a weird folder structure. According to the download logs, there is an additional resume folder nested under the resume folder. Why is this happening?
I want the exact structure under the files folder in Artifactory to be replicated in my local folder.
Please help!
The answer turned out to be much simpler than expected. All I had to do was get rid of --include-dirs=true in my command.
Read this article for more info.
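For reference, the working command is then simply the original one with --include-dirs=true removed (placeholders kept as in the question):
jfrog rt download --flat=true --user=XXX --password=XXX --url=https://XXXX.jfrog.io/XXXX --recursive '/support-pack/aem-dispatcher/files/(*)' '{1}'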

Deploy debian package for multiple distros into Jfrog Artifactory?

What is the right procedure for deploying a Debian package built for different distros into the same JFrog Artifactory Debian repo?
Just uploading to the same path but with different deb.distribution properties does not work; they all get uploaded to the same place and clobber the previous upload.
Including the distribution name in the package name is ugly, but would of course work. Is there a better way?
You simply deploy the different Debian packages to different locations within the Artifactory repository. The trick is that the repository layout has nothing to do with the aptitude API, which retrieves packages regardless of their location, according to the requested metadata (deb.distribution, deb.version, etc.).
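As an illustration, a rough sketch of deploying the same package for two distros to different paths, attaching the Debian coordinates as matrix parameters (host, repository, credentials, and the package/distribution names are placeholders):
curl -u <username>:<password> -T mypackage_1.0_amd64.deb "https://<artifactory_host>/artifactory/debian-local/pool/focal/mypackage_1.0_amd64.deb;deb.distribution=focal;deb.component=main;deb.architecture=amd64"
curl -u <username>:<password> -T mypackage_1.0_amd64.deb "https://<artifactory_host>/artifactory/debian-local/pool/jammy/mypackage_1.0_amd64.deb;deb.distribution=jammy;deb.component=main;deb.architecture=amd64"
Clients resolving from the repository then see the package under both distributions, even though the files live at different paths.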

Local alternate for Git with RStudio?

My colleague and I are starting an R project; we would both be working simultaneously on interchangeable components of the model we are building. We cannot use GitHub, as we do not want to put our code online, and it is not allowed by the organization. We also do not have a server of our own; what we have is a shared network drive. Is there a way we can use a tool like GitHub/SVN completely locally, where both of us can push our code?
There are two options for managing your R project with a git repo.
Option 1: setup remote git repo in the shared directory
You can set up a bare git repo in the shared directory, add it as a remote for your local git repo, and then pull from and push to it. Detailed steps below:
First, in an empty folder of the shared directory (say \\share\path\gitrepo), execute:
git init --bare
Then add the remote repo as a remote for the local repo you are working in.
Assume the local git repo (R project) is open in RStudio, so you can add the remote in the RStudio terminal window or through the git command line:
git remote add origin \\\\share\\path\\gitrepo
Note:
Mind the number of backslashes (\) in the remote repo URL.
The Pull and Push buttons are still disabled after adding the remote repo, since the local branch (master) does not yet track the remote branch (origin/master).
Then you can commit your changes and push to the remote repo for the first time with:
git push -u origin master
After that (local master is tracking origin/master), the Pull and Push buttons will be enabled once you refresh the git toolbar, and you can pull/push by clicking the buttons afterwards.
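Putting the steps together, a minimal end-to-end sketch from a Git Bash prompt (the share path is a placeholder; adjust the slashes to your shell as noted above):
# one-time: create the bare repo on the shared drive
git init --bare //share/path/gitrepo
# in the existing R project directory
git remote add origin //share/path/gitrepo
git add .
git commit -m "Initial commit"
git push -u origin master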
Option 2: host the remote git repo in a third-party private repo
If it's OK for you to host your git repo with a third party, without giving everyone read permission, you can create a private git repo with that provider.
Bitbucket, for example, offers free private git repos, so you can host your git repo there.
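With this option the workflow is the same as above, except that the remote URL points at the hosted repo; a sketch with placeholder account and repo names:
git remote add origin https://bitbucket.org/<account>/<repo>.git
git push -u origin master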

Nexus3 OSS: Installing Multiple Instances on Windows

This is actually an informational post to show some undocumented abilities of the Windows nexus.exe. If anyone wants to provide other useful information about Nexus3 OSS installation and/or configuration on Windows that is not readily available in the online books/documentation, that would be great!
I needed to install Nexus3 alongside our current Nexus 2.x to take advantage of the new repository formats, but Nexus 2.x is already running under the default service name of "nexus". How can I provide the service name that Nexus3 will run under on Windows?
Nexus 3 Documentation as of 2017/09/12
https://help.sonatype.com/display/NXRM3
The documentation now includes instructions to clarify how to install multiple instances for Nexus 3:
https://help.sonatype.com/display/NXRM3/Installation#Installation-RunningasaServiceonWindows
Nexus 2 Documentation as of 2017/09/12
https://help.sonatype.com/display/NXRM2/Repository+Manager+2
In order to install Nexus3 under a different service name on Windows, you will need to use the archive (zip file) download for Windows instead of the Windows executable installer. Then you simply provide the name you want the service to have as the last parameter of the normal install command.
More specific instructions:
Unpack the zip file into the desired location.
Open a command prompt with elevated permissions (run as administrator) and navigate to the bin directory of the unpacked nexus3 folder.
To create the service:
nexus.exe /install YourUniqueServiceName
To remove the service:
nexus.exe /uninstall YourUniqueServiceName
Note: Creating/removing the service this way will not delete the Nexus installation or data files; it only affects the Windows service. Conversely, the Windows installer provides an uninstaller executable that will delete the installation files and optionally also delete the data directory.
Here are some other useful configuration options for changing the port and the data directory location, which you will want to do when installing multiple instances (as in the use case above). Be sure to stop the service if you've already installed it.
To change the port:
Navigate to the "etc" directory under the nexus installation location, and open the "org.sonatype.nexus.cfg" configuration properties file.
Change the "application-port" property to the desired port value.
To change the Data Directory and/or the java.io temporary directory:
Add or modify the following command-line arguments in the "nexus.vmoptions" file in the bin directory:
-Dkaraf.data=InsertDesiredDataDirectoryHere
-Djava.io.tmpdir=InsertDesiredTmpDirectoryHere
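For instance, assuming the second instance keeps its data under D:\nexus3-data\instance2 (the paths here are placeholders), the added lines in nexus.vmoptions would look like:
-Dkaraf.data=D:\nexus3-data\instance2
-Djava.io.tmpdir=D:\nexus3-data\instance2\tmp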

Is there a URL for the latest snapshot for an artifact in Artifactory?

I would like to make a permalink to the latest snapshot version of an artifact in Artifactory. If we are on 1.0-SNAPSHOT, I would like a URL that downloads the latest 1.0-SNAPSHOT JAR. I can find the latest artifact by locating the artifact on our server at http://hostname/artifactory/libs-snapshot/groupId/artifactId/1.0-SNAPSHOT/. Other than checking the timestamps, I can figure out which one is the latest by opening maven-metadata.xml and matching metadata/versioning/snapshot timestamp and buildNumber with a JAR in the same directory. This could be scripted, but ideally Artifactory already has a way to construct a permalink in this manner. Does Artifactory provide such a URL?
Requesting the entry with artifactId-1.0-SNAPSHOT.jar in the URL should automatically return the latest snapshot.
See the doc here
One thing: this is based either on the latest creation date if no POM is present, or on the latest creation date of the POM if there is one. Mixing POM and non-POM deployments may lead to strange results!
I tried using a shell script and it worked for me.
Step 1: Get an encrypted password for your user account by clicking on your user name, or create a common user. See the "using your secure password" section at the following link:
http://www.jfrog.com/confluence/display/RTF/Centrally+Secure+Passwords
Step 2: On your local machine, create a temp folder and run this curl command (or wget on Windows):
curl -o tmp/foo.jar --user <username>:<encrypted_password> <artifactory_url>/list/libs-snapshot-local/com/search/foo/1.0/foo-1.0-SNAPSHOT.JAR
The foo.jar in the tmp folder is the latest version. If we don't give a timestamp, as above, it downloads the latest artifact for that version. Hope this helps!
This might be helpful:
How to download the latest artifact from Artifactory repository?
Although there is no permalink ability in the free version of Artifactory, it can be scripted easily as you suggest. I have provided a quick script to do that in the referenced question.
Hope it helps.
Another portable option is to use the Maven command line:
mvn org.apache.maven.plugins:maven-dependency-plugin:2.4:get -DartifactId=[artifactId] -DgroupId=[groupId] -Dversion=[version] -Ddest=[dest file]
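For example, to fetch the 1.0-SNAPSHOT JAR discussed above directly from the libs-snapshot repository (the coordinates and repository URL are placeholders taken from the question):
mvn org.apache.maven.plugins:maven-dependency-plugin:2.4:get -DgroupId=groupId -DartifactId=artifactId -Dversion=1.0-SNAPSHOT -DremoteRepositories=http://hostname/artifactory/libs-snapshot -Ddest=./artifactId-1.0-SNAPSHOT.jar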
This works for me (no search API, just direct artifact URL):
curl -O -J --user <username>:<encrypted_password> http://hostname/artifactory/libs-snapshot/groupId/artifactId/1.0-SNAPSHOT/artifactId-1.0-SNAPSHOT.jar
Basically, using 1.0-SNAPSHOT in the artifact file name downloads the latest version of the 1.0-SNAPSHOT snapshot.
