I migrated from Artifactory 4.0.2 to the latest 7.x using method 1 suggested by https://jfrog.com/knowledge-base/what-is-the-best-way-to-migrate-a-large-artifactory-instance-with-minimal-downtime/.
Now, browsing the new instance, I see all of my repository definitions as well as all metadata; however, I don't see any artifact content (e.g. jar files).
I copied all filestore content from the old instance to the new one.
What did I miss?
Thanks
From the scenario you've described, it appears to be an issue with the storage (filestore) configuration/connection.
What do we need to check next?
If you updated the filestore path as part of the migration, update that path in the storage configuration file [binarystore.xml] on the Artifactory instance. Say you moved the data from one mount to another to use it with the new Artifactory 7.x version; this step lets Artifactory know exactly where it needs to look for the binaries.
The file is located under the $JFROG_HOME/etc/artifactory/ directory in Artifactory 7.x.
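For reference, a minimal sketch of that file using the plain file-system template (the /data/new-filestore path is a placeholder for your actual mount):

    <!-- $JFROG_HOME/etc/artifactory/binarystore.xml -->
    <config version="2">
        <chain template="file-system"/>
        <provider id="file-system" type="file-system">
            <!-- Point this at the directory you copied the old filestore into -->
            <fileStoreDir>/data/new-filestore</fileStoreDir>
        </provider>
    </config>

Restart Artifactory after editing the file so the new storage configuration is picked up.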
I'm using JFrog Artifactory OSS in a docker container.
I want to download the latest version of an artifact, but it seems that this is not possible in the OSS version.
Does anybody know a way to download the latest version of the artifact?
You are right, the Latest Version API endpoint works only in Artifactory Pro.
When working with Maven repositories, you can use the SNAPSHOT support to get Artifactory to return the latest artifact.
Set the Maven Snapshot Version Behavior in the repository settings to Unique and deploy your artifacts with the -SNAPSHOT suffix. Artifactory will assign a unique version to those files internally, but you will always be able to retrieve the latest one using the -SNAPSHOT suffix.
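For example, assuming a snapshot repository named libs-snapshot-local and a hypothetical artifact com.example:myapp:1.0-SNAPSHOT, a plain HTTP request for the -SNAPSHOT file name will resolve to the newest unique snapshot Artifactory holds:

    # Repository name and coordinates below are placeholders; adjust to your setup
    curl -u user:password -O \
      "http://localhost:8081/artifactory/libs-snapshot-local/com/example/myapp/1.0-SNAPSHOT/myapp-1.0-SNAPSHOT.jar"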
Thanks a lot for your fast answer. I forgot to tell you my whole workflow.
I have a Jenkins server building, testing, and deploying my stuff.
I want to build a Spring Boot jar file with Maven and deploy it to my repository (I use JFrog). This works perfectly. In the next step I create a Docker image containing this jar file, so in the image there must be a command to download the executable jar from JFrog. For this reason I have to know the latest version of the jar file.
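Roughly, I imagine the image doing something like this (host, repository, coordinates, and base image are placeholders):

    # Sketch only; ADD from a URL assumes the repository allows anonymous read access
    FROM eclipse-temurin:17-jre
    ADD http://artifactory.example.com/artifactory/libs-snapshot-local/com/example/myapp/1.0-SNAPSHOT/myapp-1.0-SNAPSHOT.jar /app/app.jar
    ENTRYPOINT ["java", "-jar", "/app/app.jar"]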
I hope you can understand it; this is my first question in English.
Thanks a lot for helping me!
I'm using Artifactory (OSS 5.1.3) as a general build dependency cache. I've noticed that in the repository browser, each remote repository has a second entry with -cache appended, e.g. "jcenter" and "jcenter-cache".
The -cache entries are created automatically. After I added a generic "gradle-distributions" repository to cache https://services.gradle.org/distributions/, I found that I had a "gradle-distributions-cache" repository in the tree as well. The -cache entry has a different icon, but it's not listed under any of the repository types in the admin area, and it's not selectable as a source when defining a virtual repository.
Once I've downloaded an artifact, I can access it through either the main repository name or the -cache name. But if I haven't downloaded something yet, the -cache name will 404 (while the main name will go out and fetch it).
I couldn't find anything in the settings or documentation to explain the -cache repository. It's useful as a way of seeing what artifactory has already downloaded from the remote, but is there another explanation for it that I'm not apprehending? Is there a reason to point to one name or another in direct urls? (ex: gradle wrapper --gradle-version 3.4.1 --gradle-distribution-url http://localhost:8081/artifactory/gradle-distributions/gradle-3.4.1-bin.zip) This is mainly a curiosity question.
The "-cahce" repositories are mentioned in the remote repositories configuration section.
The idea is that in some cases it is useful to directly access artifacts that are already stored in the cache (for example to avoid remote update checks).
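To illustrate with the gradle-distributions example from the question:

    # Resolving through the remote repository name fetches from the remote on a cache miss:
    curl -O "http://localhost:8081/artifactory/gradle-distributions/gradle-3.4.1-bin.zip"
    # Resolving through the -cache name only serves already-cached files (404 on a miss):
    curl -O "http://localhost:8081/artifactory/gradle-distributions-cache/gradle-3.4.1-bin.zip"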
I was wondering if there is a standard way to backup and restore a Nexus OSS 3 Artifact Repository.
Is it enough to backup the "data" directory and copy it to a running new instance?
I use different types of repositories - Maven, NPM, Docker etc.
In Nexus Repository Manager 3.2 there will be a supported Backup/Restore feature. Your approach seems fine; a few notes:
back up your data directory while Repository Manager is stopped (this prevents the database files from being left in an inconsistent state; see the sketch after these notes)
test your restoration procedure to ensure it works
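A minimal sketch of that procedure, assuming a systemd service named nexus and the default data directory /opt/sonatype-work/nexus3 (adjust both to your install):

    # Stop the service so the database files are quiescent, then archive the data dir
    sudo systemctl stop nexus
    tar -czf nexus3-data-backup.tar.gz -C /opt/sonatype-work nexus3
    sudo systemctl start nexus
    # On the new instance: stop it, unpack the archive over its data directory, start it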
If you run into issues, please file them here so we can be aware: https://issues.sonatype.org/browse/NEXUS
So I have a lot of websites, 150+. Starting with the bigger sites, I am beginning to set up git repositories to track the changes to these sites. I can create a local server version of a site and set up the repository, and everything runs fine.
I have set up a .gitignore file to ignore all the core files, plugin folders, etc. Again, this is fine; the files are still on my local machine and have been removed from my repository.
What I want to do is set up this repository on multiple computers (for my colleagues, who do less development work but will still need access to the repository). I imagine cloning won't work, as all the core files are no longer in the repository. How do I get around this?
Thanks all!
EDIT:
I should have mentioned we're using BitBucket to act as a central repository if that makes any difference.
There are a few ways you can do that.
You can set up the local environment in one location and keep the git repository in another.
After cloning or pulling the repository, you can run a script that copies the files from the repository into the local environment.
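For instance (the repository URL and paths below are placeholders):

    # Clone from BitBucket, then copy the tracked files into the live webroot
    git clone git@bitbucket.org:yourteam/example-site.git /srv/repos/example-site
    rsync -av --exclude='.git' /srv/repos/example-site/ /var/www/example.com/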
You can add all files to the repository, ignoring only var/, .htaccess, app/etc/local.xml and .gitignore. Bear in mind that you can break a website by changing files which should not have been changed, and debugging then becomes a nightmare. With everything in git, you know instantly what went wrong.
We've managed to set up a great workflow using beanstalk.com. They have an option to share repositories (like GitHub) and then deploy them to different servers through SSH. Works like a charm; highly recommended.
I want to migrate my 20-year-old Maven repo to Artifactory. What are my possible options for migration, starting with the easiest or quickest way to migrate and start using Artifactory?
One way I could think of was a remote repo in Artifactory that would proxy my existing repo.
The solution you've already suggested (passive proxying) is seamless and useful for weeding out artifacts that are no longer needed/used, but it will also take you more time to migrate away from the old instance.
For a more direct approach, Artifactory has a repository import facility that allows you to import content into Artifactory from a physical location on the installed machine.
A little less convenient, but also an option, is to write a script that crawls the content directory of the old repo and deploys every artifact with an HTTP PUT request.
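A minimal sketch of such a script, assuming the old repo's content sits under ./old-repo and a target repository key of maven-migrated (host, credentials, and names are placeholders):

    # Deploy every file, preserving its relative path as the target path in Artifactory
    cd old-repo
    find . -type f | while read -r f; do
      curl -u admin:password -T "$f" \
        "http://localhost:8081/artifactory/maven-migrated/${f#./}"
    done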