I am using the JFrog CLI (jfrog rt download) to download build reports from Artifactory. GitLab CI publishes them there unpacked, so the HTML reports can be browsed directly.
However, it takes extremely long (10-20 minutes) simply because of how many small files there are.
I see that Artifactory has a REST API to download the whole content of a repository folder in one go, as a single archive.
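For reference, that endpoint can be called like this (host, repo and path are placeholders, and folder download has to be enabled on the instance):
curl -u "$USER:$PASS" -o reports.zip "https://artifactory.example.com/artifactory/api/archive/download/reports-repo/build-123?archiveType=zip"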
But I am not able to find any way to do the same using JFrog CLI.
Am I missing something, or is there truly no way to download a whole folder's content as an archive using the JFrog CLI?
P.S.: I am aware that there is a configuration option in Artifactory that supposedly allows browsing the contents of archives, but there are organizational and technical reasons preventing me from using it.
Using the CLI you can increase the "--threads" value. I have seen a massive improvement when downloading a directory with lots of small files after increasing the number of threads.
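For example (the repository path and the thread count are just placeholders):
jfrog rt dl --threads=15 "reports-repo/build-123/" ./reports/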
Is there any option that allows uploading already existing artifacts without getting the error "Not enough permissions to delete/overwrite artifact"?
I know that I could give the "Delete" permission to my uploading user, but this is not what I want, since I don't want artifacts to be overwritten. For example, the Python package uploader twine has such an option, --skip-existing.
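For reference, the twine behaviour looks like this (assuming the usual dist/ layout):
twine upload --skip-existing dist/*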
Related StackOverflow posts:
How to avoid overriding the existing files while uploading to Artifactory using JFrog rt CLI
How can I prevent previously deployed artifacts from being overwritten?
I have already created an issue on GitHub for my question as well.
Hi, I am trying to find a simple solution for running a static security scan on binaries stored in JFrog Artifactory. It looks like the Veracode integration supports Artifactory 6.7.8 (https://community.veracode.com/s/article/Support-Matrix). Has anyone used this plugin with newer versions of Artifactory? If so, how did you add the plugin to Artifactory? I'm trying to find a simple way to add the integration.
I was able to do this by using the rtUpload and rtDownload Jenkins pipeline steps: https://www.jfrog.com/confluence/display/JFROG/Declarative+Pipeline+Syntax. The rough flow (sketched after the list) is:
Make a repo for the binaries
Write a function for uploading the binaries to Artifactory
Write a function for downloading the binaries from Artifactory
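Those are Jenkins pipeline steps rather than shell commands, but for illustration, the equivalent flow with the plain JFrog CLI would look roughly like this (server ID, repo and paths are made up):
jfrog rt u "build/binaries/*" binaries-local/myapp/ --server-id=my-rt
jfrog rt dl "binaries-local/myapp/*" ./scan-input/ --server-id=my-rt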
How can I copy a specific artifact to another instance of Artifactory?
From what I can see, the export/import functionality only works for copying a full system or a full single repo. I don't want to replicate the full repo either; I just want to copy specific artifacts.
Have you tried using the JFrog CLI? It can download artifacts from Artifactory by a given pattern. So, for example, you can download only the "war" files from a specific repository and then deploy them to the rest of the instances you want. You can also write a script that uses the JFrog CLI to download those artifacts and then publish them to the other Artifactory instances.
https://www.jfrog.com/getcli/
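A minimal sketch of such a script, assuming two preconfigured CLI server IDs (source and target) and a hypothetical repo name:
# pull only the war files from the source instance
jfrog rt dl "libs-release-local/*.war" wars/ --server-id=source
# push them to the same repo on the target instance
jfrog rt u "wars/*" libs-release-local/ --server-id=target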
You can use the JFrog CLI to copy a file, under a new name if needed, to another repo:
jfrog rt cp "your-artifactory-repo/artifact.extension" your-new-artifactory-repo/artifact.extension
Note: you can use * if you want to copy all the artifacts from the folder.
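For example (same hypothetical repo names as above):
jfrog rt cp "your-artifactory-repo/folder/*" your-new-artifactory-repo/folder/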
I'm wondering how other Artifactory admins handle this, so here's my question:
We're starting to use Artifactory to manage our artifacts, internal as well as external ones. The external artifacts are all available in an internal repository; this is a result of converting a file-based repository to Artifactory.
Now this is starting to cause issues, and I'm wondering how others manage their external dependencies. As an Artifactory administrator I want to be sure that my developers only use artifacts which have the correct license, so I don't want a "feel free to download everything from the internet" culture.
I want to provide some sort of a "whitelisted and approved" set of external Artifacts.
Is this possible using Artifactory OSS, or do we have to manually download the artifacts from a remote repository and deploy them to our local repository?
Thank you in advance!
This can be done by writing a user plugin, but it will require the Pro version of Artifactory. You can see here examples of a governance control plugin that was written in the past.
With the OSS version you can't reject user downloads based on license.
Hope that answers your question.
So I have a lot of websites, 150+. Starting with the bigger sites, I am beginning to set up git repositories for tracking the changes to these sites. I can create a local-server version of a site and set up the repository, and everything runs fine.
I have set up a .gitignore file to ignore all the core files, plugin folders, etc. Again this is fine; the files are still on my local machine and have been deleted from my repository.
What I want to do is set up this repository on multiple computers (for my colleagues, who do less development work but will still need access to the repository). I imagine cloning won't work, as the core files are no longer in the repository. How do I get around this?
Thanks all!
EDIT:
I should have mentioned we're using BitBucket as a central repository, if that makes any difference.
There are a few ways you can do that.
You can set up the local environment in one location and keep the git repository in another.
After cloning or pulling the repository, you can then run a script which copies the files from the repository to the local environment.
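A minimal sketch of such a script, assuming the repository is cloned to ~/repos/mysite and the live docroot is /var/www/mysite (both paths are hypothetical):
#!/bin/sh
# copy the working tree (minus git metadata) into the live environment
rsync -av --exclude='.git' "$HOME/repos/mysite/" /var/www/mysite/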
You can add all files to the repository, ignoring only var/, .htaccess, app/etc/local.xml and .gitignore. Bear in mind that you can break a website by changing files which should not have been changed, and debugging then becomes a nightmare. Having everything in git, you know instantly what went wrong.
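For example, such a .gitignore could be written from the shell like this (the paths just mentioned):
cat > .gitignore <<'EOF'
var/
.htaccess
app/etc/local.xml
.gitignore
EOF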
We've managed to set up a great workflow using beanstalk.com. They've got an option to share repositories (like GitHub) and then deploy them to different servers through SSH. Works like a charm - highly recommended.