I was wondering if there is a standard way to back up and restore a Nexus OSS 3 artifact repository.
Is it enough to back up the "data" directory and copy it to a new, running instance?
I use different types of repositories - Maven, NPM, Docker etc.
In Nexus Repository Manager 3.2 there will be a supported Backup/Restore feature. Your approach seems fine; a few notes:
back up your data directory while Repository Manager is stopped (this prevents the database files from being left in an inconsistent state); a minimal sketch follows below
test your restoration procedure to ensure it works
If you run into issues, please file them here so we can be aware: https://issues.sonatype.org/browse/NEXUS
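For reference, a minimal backup/restore sketch of that approach, assuming a default install where the data directory is /opt/sonatype-work/nexus3 and the service is named nexus (both are assumptions; adjust for your setup):

sudo systemctl stop nexus        # stop so the database files are consistent
sudo tar -czf nexus3-data-$(date +%F).tar.gz -C /opt/sonatype-work nexus3
sudo systemctl start nexus
# Restore: stop the new instance, unpack the archive over its data directory,
# start it again, and verify that the Maven, npm and Docker repositories resolve.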
I migrated from Artifactory 4.0.2 to the latest 7.x using method 1 suggested by https://jfrog.com/knowledge-base/what-is-the-best-way-to-migrate-a-large-artifactory-instance-with-minimal-downtime/.
Now, browsing the new repository, I see all the definitions of my repositories as well as all the metadata, but I don't see any artifact content (e.g. jar files).
I copied all the filestore content from the old instance to the new one.
What did I miss?
Thanks!
From the scenario you've described, it appears to be an issue with the storage (filestore) configuration/connection.
What do we need to check next?
If you have updated the filestore path as part of the migration, update the path in the storage configuration file (binarystore.xml) in the Artifactory instance. Say you have moved the data from one mount to another to use it with the new Artifactory 7.x version; this step tells Artifactory exactly where it needs to look for the binaries.
The file is located under the $JFROG_HOME/etc/artifactory/ directory in Artifactory 7.x.
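For illustration, a minimal sketch of what that filestore configuration might look like, assuming a plain file-system binary provider; the fileStoreDir value is an assumption, and the exact schema should be checked against the JFrog documentation for your version:

# Hypothetical sketch: point Artifactory 7.x at the migrated filestore.
# Replace the fileStoreDir value with your actual mount.
cat > "$JFROG_HOME/etc/artifactory/binarystore.xml" <<'EOF'
<config version="v1">
    <chain template="file-system"/>
    <provider id="file-system" type="file-system">
        <fileStoreDir>/new-mount/artifactory/data/filestore</fileStoreDir>
    </provider>
</config>
EOF
# Restart Artifactory afterwards so the new storage configuration is picked up.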
I'm using JFrog Artifactory OSS in a Docker container.
I want to download the latest version of an artifact, but it seems that this is not possible in the OSS version.
Does anybody know a way to download the latest version of the artifact?
You are right, the Latest Version API endpoint works only in Artifactory Pro.
When working with Maven repositories, you can use SNAPSHOT support to get Artifactory to return the latest artifact.
Set the Maven Snapshot Version Behavior in the repository settings to Unique and deploy the artifacts with a -SNAPSHOT suffix. Artifactory will assign a unique version to those files internally, but you will always be able to retrieve the latest one using the -SNAPSHOT suffix.
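For example, a minimal retrieval sketch, assuming a snapshot repository named libs-snapshot-local, artifact coordinates com.example:my-app:1.0-SNAPSHOT, and credentials in environment variables (all of these are assumptions):

# Hypothetical example: requesting the non-unique -SNAPSHOT file name makes
# Artifactory return the latest unique snapshot it holds for that version.
curl -u "$ARTIFACTORY_USER:$ARTIFACTORY_PASSWORD" -O \
  "https://artifactory.example.com/artifactory/libs-snapshot-local/com/example/my-app/1.0-SNAPSHOT/my-app-1.0-SNAPSHOT.jar"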
Thanks a lot for your fast answer. I forgot to tell you my whole workflow.
I have a Jenkins server building, testing, and deploying my stuff.
I want to build a Spring Boot jar file with Maven and deploy it to my repository (I use JFrog). This works perfectly. In the next step I will create a Docker image containing this jar file, so the image build must include a command to download the executable jar from JFrog. For this reason I have to know the latest version of the jar file.
I hope you can understand it; this is my first question in English.
Thanks a lot for helping me!
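Building on the -SNAPSHOT approach above, a rough sketch of that Docker step, assuming the same hypothetical coordinates and a Dockerfile in the current directory that simply copies app.jar into the image:

# Hypothetical CI step: fetch the latest snapshot jar from Artifactory,
# then build a Docker image that contains it.
curl -fSL -u "$ARTIFACTORY_USER:$ARTIFACTORY_PASSWORD" -o app.jar \
  "https://artifactory.example.com/artifactory/libs-snapshot-local/com/example/my-app/1.0-SNAPSHOT/my-app-1.0-SNAPSHOT.jar"
docker build -t my-app:latest .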
We're making use of a remote repository and are storing artifacts locally. However, we are running into a problem because the remote repository regularly rebuilds all the artifacts that it hosts. In our current state, we update metadata (e.g. repodata/repomd.xml), but artifacts are not updated.
We have to continually clear our local remote-repository-cache out in order to allow it to download the rebuilt artifacts.
Is there any way we can configure Artifactory to re-cache new artifacts as well as the new artifact metadata?
In our current state, the error we regularly run into is:
https://artifactory/artifactory/remote-repo/some/path/package.rpm:
[Errno -1] Package does not match intended download.
Suggestion: run yum --enablerepo=artifactory-newrelic_infra-agent clean metadata
Unfortunately, there is no good answer to that. Artifacts under a version should be immutable; it's dependency management 101.
I'd put as much effort as possible into convincing the team producing the artifacts to stop overriding versions. It's true that changing dependency versions in metadata can sometimes be cumbersome, but there are ways around it (like resolving the latest patch during development, as supported by the semver spec), and in any case that's not a good excuse.
If that's not possible, I'd look into enabling direct repository-to-client streaming (i.e. disabling artifact caching) to prevent the problem of stale artifacts.
Another solution might be cleaning up the cache with a user plugin, or with a script using JFrog CLI, once you learn that newer artifacts have been published in the remote repository.
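As a rough sketch of the JFrog CLI variant, assuming the remote repository is named remote-repo (so its cache is addressable as remote-repo-cache) and the CLI is already configured against the Artifactory server:

# Hypothetical cleanup: drop the cached copy so the rebuilt artifact is
# fetched from the remote source the next time a client requests it.
jfrog rt del "remote-repo-cache/some/path/package.rpm" --quiet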
So I have a lot of websites, 150+. Starting with the bigger sites, I am beginning to set up git repositories for tracking the changes to these sites. I can create a local server version of a site and set up the repository, and everything runs fine.
I have set up a .gitignore file to ignore all the core files, plugin folders, etc. Again this is fine; the files are still on my local machine and have been deleted from my repository.
What I want to do is set up this repository on multiple computers (for my colleagues, who do less development work but will still need access to the repository). I imagine cloning won't work, as all the core files are no longer in the repository. How do I get around this?
Thanks all!
EDIT:
I should have mentioned we're using BitBucket to act as a central repository, if that makes any difference.
There are a few ways you can do that.
You can set up the local environment in one location and keep the git repository in another location.
After cloning or pulling the repository, you can then run a script which copies the files from the repository to the local environment.
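A rough sketch of such a copy script, assuming a repository cloned to ~/repos/my-site and a web root of /var/www/my-site (both paths are assumptions):

#!/usr/bin/env bash
# Pull the latest changes, then copy the tracked files into the web root.
REPO_DIR=~/repos/my-site
WEB_ROOT=/var/www/my-site
git -C "$REPO_DIR" pull
# Copy everything except the .git directory; the ignored core/plugin files
# were never committed, so they stay untouched in the web root.
rsync -av --exclude='.git' "$REPO_DIR"/ "$WEB_ROOT"/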
You can add all files to the repository, ignoring only var/, .htaccess, app/etc/local.xml and .gitignore. Bear in mind that you can break a website by changing files which should not have been changed; debugging then becomes a nightmare. With everything in git, you know instantly what went wrong.
We've managed to set up a great workflow using beanstalk.com. It has an option to share repositories (like GitHub) and then deploy them to different servers through SSH. Works like a charm - highly recommended.
I want to move a set of artifacts from one Nexus to another (download and later upload). I can only download the artifacts one by one. Is there any way to download an entire folder? Is there any other kind of operation, like export/import?
Thanks!
EDIT:
I have access to the Nexus storage directory (sonatype-work\nexus\storage) in the user folder, and I got all the artifacts from there. I didn't find any way to do it from the web client.
Nexus stores the artifacts on disk in standard Maven 2/3 repository layout, so you can just directly copy the artifacts from one storage directory to the other using whatever means you like.
After you're finished, schedule a repair index task against the destination repository so that searching for artifacts from the web UI will work. Note that your builds will work immediately after you copy the artifacts; indexes are not used by Maven builds.
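For example, a rough copy sketch, assuming the storage directories live under /opt/sonatype-work on both hosts and the repository is called releases (paths and hostname are assumptions):

# Copy one repository's storage directory to the destination Nexus host.
rsync -av /opt/sonatype-work/nexus/storage/releases/ \
  nexus-target:/opt/sonatype-work/nexus/storage/releases/
# Then schedule a "Repair Index" task against that repository in the
# destination Nexus UI so web search picks up the copied artifacts.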