Is there a way to store my app's provisioning profiles alongside the code in my Git repository and have Bitrise install them at runtime, instead of manually uploading the profiles to Bitrise?
It is just a matter of copying the *.mobileprovision files to the proper path under the user's home directory:
cp ./Provisioning/*.mobileprovision ~/Library/MobileDevice/Provisioning\ Profiles/
Solution based on https://stackoverflow.com/a/40728419/1049134
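On Bitrise this could run in a Script step before the build. A minimal sketch, assuming the profiles live under ./Provisioning in the repo (mkdir -p guards against the directory missing on a clean build VM):
#!/bin/bash
set -e
# make sure the target directory exists on a fresh build machine
mkdir -p "$HOME/Library/MobileDevice/Provisioning Profiles"
# install the profiles committed alongside the code
cp ./Provisioning/*.mobileprovision "$HOME/Library/MobileDevice/Provisioning Profiles/"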
Why not create a repository using the Fastlane Match approach and use the Fastlane Match integration? That way you can manage certificates and profiles not only for yourself but for your whole team.
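For example, a rough sketch of the workflow, assuming fastlane is installed and you have a private Git repo to hold the certificates:
# one-time setup: generates a Matchfile pointing at your certificates repo
fastlane match init
# fetch (or create) the development certificates and profiles on any machine or CI box
fastlane match development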
How can I copy a specific artifact to another instance of Artifactory?
From what I can see the export/import functionality only works for full system or full single repo copying. I don't want to replicate the full repo either. I just want to copy specific artifacts.
Have you tried the JFrog CLI? It can download artifacts from Artifactory by a given pattern, so you can, for example, download only the "war" files from a specific repository and then deploy them to the rest of the instances you want. You can also write a script around the JFrog CLI that downloads those artifacts and then publishes them to the other Artifactory instances.
https://www.jfrog.com/getcli/
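A sketch of that approach (the repo names, URLs and API keys are placeholders; flag names can differ between CLI versions, so check jfrog rt dl --help):
# download only the war files from the source instance
jfrog rt dl "libs-release-local/*.war" ./transfer/ --url=https://source.example.com/artifactory --apikey=<SOURCE_KEY>
# publish them to the target instance
jfrog rt u "./transfer/*" libs-release-local/ --url=https://target.example.com/artifactory --apikey=<TARGET_KEY>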
You can use the JFrog CLI to copy a file under a new name to another repo:
jfrog rt cp "your-artifactory-repo/artifact.extension" your-new-artifactory-repo/artifact.extension
Note: you can use * if you want to copy all the artifacts from the folder.
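For instance (a sketch; --flat=false keeps the source folder hierarchy, so verify the flag against your CLI version):
# copy every artifact under the folder, preserving the directory layout
jfrog rt cp "your-artifactory-repo/some/folder/*" your-new-artifactory-repo/some/folder/ --flat=false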
I am looking for a solution that would allow me to have a network share where people can access (read-only) the artifacts from an Artifactory repository.
Why? We use Artifactory to also keep track of big binaries like installation kits, ISO images and so on, and it takes a lot of time to download all of them (sometimes as zips), unpack them and run them. If these were exported to an NFS/SMB share, people would only have to mount it in order to use them.
How can we achieve this? Please keep in mind that we also want to automate this, so the files would be updated by Artifactory when needed.
Artifactory supports WebDAV out of the box.
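So one option is to mount a repository over WebDAV. A rough sketch on Linux, assuming davfs2 is installed and the repository path matches your setup:
# mount an Artifactory repository read-only over WebDAV (requires davfs2)
sudo mount -t davfs -o ro https://artifactory.example.com/artifactory/my-repo /mnt/artifactory-repo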
It seems that's not possible at the moment, and there is a feature request for enabling it:
https://www.jfrog.com/jira/browse/RTFACT-8302
Feel free to vote and comment on it, so that JFrog realises how important this use case is.
I guess they should be able to provide a script that mirrors/syncs a repository to an NFS share, but that would almost double the storage space needed.
If instead they used hardlinks or symlinks to create a browsable tree of the repository inside the storage directory, this would be solved and no sync would be needed.
As part of my custom promotion process I want to get rid of older builds of the SNAPSHOT versions. When I trigger a build delete via the Artifactory REST API with the 'artifacts=1' flag, the artifacts themselves are deleted successfully, but the directories where they were stored remain. Is there a convenient way to delete those empty directories as well during the build delete operation?
cheers,
René
The directories aren't deleted by default. You can write a user plugin that deletes empty folders in an afterDelete trigger. Here's an example: https://github.com/JFrogDev/artifactory-user-plugins/blob/master/cleanup/deleteEmptyDirs/deleteEmptyDirs.groovy
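Once deployed, an execution-type plugin is triggered through the plugins REST endpoint. A hedged sketch (the execution name and the paths parameter are taken from the linked example, so verify them against the plugin source):
# hypothetical invocation; the execution name and params must match the deployed plugin
curl -u admin:password -X POST "https://artifactory.example.com/artifactory/api/plugins/execute/deleteEmptyDirsPlugin?params=paths=libs-snapshot-local/com/example"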
Problem description:
I have multiple Alfresco installations (development, testing, production) of one project.
I need to copy files under Data Dictionary folder (Scripts, Templates, Web Scripts) from one to another in one direction (development -> testing -> production).
Current solution:
I copy the files manually via WebDAV, which is annoying and unreliable (I can forget to copy some).
Desired solution:
I'd like to have a tool which copies the changed files at my command, once they are ready for the next step. I had the idea that it could internally use a Git repository with a branch for each installation, fetching the files from development and pushing them to testing and production. This way (with Git) it could also support reverting changes.
It looks like quite a common problem, but I wasn't able to google anything about it, so I'm asking here. Does such a tool exist, or is there a better way of managing multiple repositories?
If you have a brand-new installation of your development/testing/production Alfresco instances, you could simply migrate the alf_data dir content, which by default contains the database, indexes, content store and backup files. If you need to, you could migrate the "shared" folder too, or at least the files from it that hold Alfresco customizations (custom scripts or similar). Here is the link that helps with the migration steps:
http://wiki.alfresco.com/wiki/System_Migration
Otherwise, if you only need to move a folder from the Data Dictionary, or a set of documents, you could use an ACP (Alfresco Content Package) to achieve that. Here is the wiki page for doing this: http://wiki.alfresco.com/wiki/Export_and_Import
You could do this via FTP. When you want to deploy new changes, you can go with a manual client like FileZilla to download the changes from dev and then upload them to test.
But you can also automate FTP, so that a scheduled job checks whether there is anything new on, say, dev and pushes it to test.
If you use Git for source control, you could also do this via git-ftp. Keep a copy of the Data Dictionary in your source folder, then add some sort of pre-commit check which detects whether you changed any of those files. If you did, the commit will push the change to dev and test; a sketch follows below.
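A sketch of the git-ftp part (server URL and credentials are placeholders):
# one-time: record the current state on the dev server
git ftp init --user deploy --passwd secret ftp://dev.example.com/alfresco/data-dictionary
# afterwards: upload only the files that changed since the last push
git ftp push --user deploy --passwd secret ftp://dev.example.com/alfresco/data-dictionary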
I think the Alfresco Replication Service is suitable for you.
http://wiki.alfresco.com/wiki/Alfresco_Community_3.4.a#Replication
I love GitHub and RStudio for my workflow. Recently I created a project template that makes directories and scripts etc., and I would like to create the repo locally and push it to GitHub.
In the past I created a repo for a project via https://github.com/, used version control in RStudio to create the local repo, and then dumped all the files I already had into it.
This seems a waste of time. How can one take the directory/repo that's already in RStudio with a .Rproj file and upload it to GitHub without first creating the shell repo at https://github.com/?
I think this could save time in the workflow.
I thought I could just follow the directions -here- (under "Adding version control to a project") to add version control, but this doesn't allow me to push to GitHub (nor should it, because how would RStudio know which Git host you want to push to?).
The only way you could create a repository on GitHub directly from your computer, without creating it on their website first, would be to have Git create the remote repository from your system. This is possible on some Git servers, but not on GitHub.
However, GitHub provides an API that allows you to create the repository from the command line, via a call to curl for example. You will find information on how to do it in this answer (not tested):
curl -u 'USER:PASS' https://api.github.com/user/repos -d '{"name":"REPO"}'
git remote add origin git@github.com:USER/REPO.git
git push origin master
But I don't think you will be able to do it directly from RStudio: you will need to put your project under version control, and then execute the three commands above in a shell.
Have you seen hub?
hub create
git push -u origin master
will do the job for you, once hub is configured to access your GitHub account. If you want the project to be named differently from the parent directory, use
hub create projectname
The general usage is
hub create [NAME] [-p] [-d DESCRIPTION] [-h HOMEPAGE]
(-p -- private repository), and you can access many more GitHub features with this tool.