I have an Artifactory server with a Conan repository.
Currently I have a CI system that pushes to a single Conan channel, acme/stable:
conan upload base64/1.0.0-2@acme/stable
How would you create a second channel and restrict Artifactory permissions so that not all users can write to the stable channel?
Create a new Artifactory permission target that can write to the conan-local repo.
Remove the default Include Pattern **.
Add an Exclude Pattern that matches acme/**/**/stable/**.
Now users covered by this permission will be able to upload their own test packages, but will not be able to overwrite the stable channel; an upload like the following will only succeed for accounts (such as the CI user) that still have write access to that path:
conan upload base64/1.0.0-2@acme/stable -r artifactory
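If you prefer to script this instead of clicking through the UI, the same permission target can be created with Artifactory's security REST API. This is only a sketch: the target name, group name, credentials, and base URL are placeholders, and the JSON follows the v1 permissions format, so verify the fields against your Artifactory version.
# Sketch only: names, credentials, and URL are placeholders.
# The excludesPattern mirrors the UI steps above and keeps the stable channel off limits;
# "r", "w", "n" grant read, deploy, and annotate on conan-local.
curl -u admin:password -X PUT \
  "https://artifactory.example.com/artifactory/api/security/permissions/conan-non-stable" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "conan-non-stable",
        "repositories": ["conan-local"],
        "excludesPattern": "acme/**/**/stable/**",
        "principals": {
          "groups": {"developers": ["r", "w", "n"]}
        }
      }'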
How to disable re-publishing of the same version of an artifact in JFrog Artifactory?
For example, in a Docker repository my-docker-releases, the image myapp:1.0.0 is published. It should not be possible for anyone to publish another image with the same tag myapp:1.0.0 to the same repository.
How can this be achieved through Artifactory repository settings?
As mentioned in their documentation:
https://www.jfrog.com/confluence/display/RTF6X/Managing+Permissions
Preventing Overwriting Deployments
You can prevent a user or group from overwriting a deployed release or unique snapshot by not granting the Delete permission. Non-unique snapshots can always be overwritten (provided the Deploy permission is granted).
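As a concrete illustration, a permission target for my-docker-releases can grant deploy without delete, which (per the documentation above) prevents overwriting existing releases. This is only a sketch with placeholder names, using the v1 security REST API; double-check it against your Artifactory version.
# Sketch only: grants read ("r"), deploy ("w"), and annotate ("n") but omits delete ("d"),
# so an already-published tag such as myapp:1.0.0 cannot be overwritten by this group.
curl -u admin:password -X PUT \
  "https://artifactory.example.com/artifactory/api/security/permissions/docker-no-overwrite" \
  -H "Content-Type: application/json" \
  -d '{
        "name": "docker-no-overwrite",
        "repositories": ["my-docker-releases"],
        "principals": {
          "groups": {"ci-publishers": ["r", "w", "n"]}
        }
      }'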
Is there a way to store my app's provisioning profiles alongside the code in my Git repository and have Bitrise install them at runtime, instead of manually uploading the profiles to Bitrise?
It is just a matter of copying the *.mobileprovision files to the proper user path:
cp ./Provisioning/*.mobileprovision ~/Library/MobileDevice/Provisioning\ Profiles/
Solution based on https://stackoverflow.com/a/40728419/1049134
Why not create a repository using the Fastlane Match approach and use the Fastlane Match integration, so that you can manage certificates and profiles not only for yourself but for your whole team?
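For reference, fetching certificates and profiles with match from a CI step can look roughly like this; the profile types are just examples and the exact invocation depends on your Matchfile.
# Sketch only: assumes a Matchfile already points at your shared certificates repository.
fastlane match appstore --readonly      # fetch distribution certificates/profiles without modifying them
fastlane match development --readonly   # same for development profiles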
I've installed the DAI community app on an AWS instance following these instructions:
http://docs.h2o.ai/driverless-ai/latest-stable/docs/userguide/install/aws.html
https://localhost:12345 gives me the login screen. The standard credentials "h2oai" and "h2oai" return "Invalid User Id or Password". What are the credentials?
Also, DAI installed to /opt. I do not see the /data and /license folders anywhere. Where are they?
I expect that you are probably using the AWS Marketplace image and not the community AMI. The AWS Marketplace image is password protected in order to comply with AWS Marketplace standards. The username should be 'h2oai' and the password should be the AWS instance ID.
With respect to the folders, the AWS Marketplace image uses our Debian install package. See the documentation here: http://docs.h2o.ai/driverless-ai/latest-stable/docs/userguide/install/linux-deb.html for information about which folders are available where. The software is installed into /opt/h2oai/dai, and you will find the log folder there, but it is better to access the logs via systemctl and journalctl.
NOTE: an additional configuration folder can be found at /etc/dai.
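If it helps, checking the service through systemd usually looks like the following; the unit name dai is an assumption based on the Debian package and may differ on your instance.
# Assumes the package registered a systemd unit called "dai"; adjust the name if yours differs.
sudo systemctl status dai      # check whether Driverless AI is running
sudo journalctl -u dai -f      # follow the service logs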
A repository I use regularly deletes old versions of deb packages as it releases new ones, which is very annoying as it breaks our builds until we can bump the package version in our config management.
I was hoping that the Remote Repositories feature would let me create a cache that keeps packages even after the original repository has deleted them.
Do Artifactory remote repositories delete files when they are deleted from the original repository?
Thanks
The answer is no: Artifactory will not delete cached files just because the remote repository deleted them. There is, however, a scenario in which Artifactory will clean up unused artifacts, but only if the admin chooses to enable it. In the Artifactory UI --> Admin --> Remote Repositories --> Repository Configuration --> Advanced tab, there is a setting named "Unused Artifacts Cleanup Period". By default this field is empty, which means no cleanup. Unless the admin changes this field, nothing should be deleted automatically by any Artifactory process.
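To verify this for a particular remote repository, you can fetch its configuration over the REST API; the repository key below is a placeholder, and the field name holding the cleanup period (I believe it is unusedArtifactsCleanupPeriodHours) is an assumption worth checking in the response.
# Sketch only: "deb-remote" is a placeholder repository key.
# Look for the cleanup period in the returned JSON (believed to be "unusedArtifactsCleanupPeriodHours");
# an empty/zero value matches the empty UI field, i.e. no automatic cleanup.
curl -u admin:password "https://artifactory.example.com/artifactory/api/repositories/deb-remote"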
How can I copy a specific artifact to another instance of Artifactory?
From what I can see, the export/import functionality only works for copying a full system or a full single repository. I don't want to replicate the full repository either; I just want to copy specific artifacts.
Have you tried using the JFrog CLI? It can download artifacts from Artifactory based on a specific pattern. For example, you can download only the "war" files from a specific repository and then deploy them to the rest of the instances you want. You can also write a script around the JFrog CLI that downloads those artifacts and then publishes them to the other Artifactory instances.
https://www.jfrog.com/getcli/
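A minimal sketch of that download-then-publish approach could look like this; the repository keys, paths, URLs, and credentials are placeholders, and the commands use the older jfrog rt syntax.
# Sketch only: repository keys, paths, URLs, and credentials are placeholders.
# Download the matching artifacts from the source instance...
jfrog rt dl "libs-release-local/com/acme/myapp/*.war" ./transfer/ --flat=true \
  --url https://source.example.com/artifactory --user admin --password secret
# ...then upload them to the target instance.
jfrog rt u "./transfer/*.war" libs-release-local/com/acme/myapp/ \
  --url https://target.example.com/artifactory --user admin --password secret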
You can use the JFrog CLI to copy a file (optionally under a new name) to another repository within the same Artifactory instance:
jfrog rt cp "your-artifactory-repo/artifact.extension" your-new-artifactory-repo/artifact.extension
Note: you can use * if you want to copy all the artifacts from a folder.
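For instance, something like the following copies everything under a folder (the repository and folder names are placeholders):
jfrog rt cp "your-artifactory-repo/some-folder/*" your-new-artifactory-repo/some-folder/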