Bitbucket Server API Call For Deleting and/or Renaming Content

I'm looking for an API endpoint to delete and/or rename a piece of content in a repository, but I don't see anything relevant here:
https://developer.atlassian.com/bitbucket/api/2/reference/
How does one do this?

Unfortunately, Atlassian stated in May 2017 that this is not supported.
Looking at the Bitbucket Server REST API as of version 5.10.1 (June 2018), it is still not supported.
There is a files endpoint at /rest/api/1.0/projects/{projectKey}/repos/{repositorySlug}/files, but it only supports GET, to list the files in a specific directory of the repository.
There is also a /rest/api/1.0/projects/{projectKey}/repos/{repositorySlug}/browse/{path:.*} endpoint. It supports GET to list the files in a directory of a repository, and PUT to commit one file per call. However, DELETE is not supported on that endpoint.
The same goes for renames. The documentation does not mention the ability to do so with a REST API call.
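For completeness, here is a minimal sketch of the one-file-per-call commit that the PUT endpoint does support; the host, credentials, project key, repo slug, branch, and file path are all hypothetical:

# Commits a new version of docs/readme.md in a single call.
# sourceCommitId (the commit the file was last modified in) is required
# when editing an existing file; omit it when creating a new one.
curl -u user:password -X PUT \
  "https://bitbucket.example.com/rest/api/1.0/projects/PROJ/repos/my-repo/browse/docs/readme.md" \
  -F branch=master \
  -F message="Update readme" \
  -F sourceCommitId=abc123def456 \
  -F content=@readme.md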

The /2.0/repositories/{username}/{repo_slug}/src endpoint can be used to update or delete files.
From the docs:
To create a commit that deletes files, use the files parameter:
$ curl https://api.bitbucket.org/2.0/repositories/username/slug/src \
  -F files=/file/to/delete/1.txt \
  -F files=/file/to/delete/2.txt
You can add/modify/delete multiple files in a request. Rename/move a file by deleting the old path and adding the content at the new path, as in the sketch below.
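For example, a rename under those rules might look like this; the repository, credentials, and paths are hypothetical, and name.txt is a local copy of the file's content:

# Delete the old path and re-add the content at the new path in one commit
curl -u username:app_password https://api.bitbucket.org/2.0/repositories/username/slug/src \
  -F files=/old/dir/name.txt \
  -F /new/dir/name.txt=@name.txt \
  -F message="Rename name.txt"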

Related

Grab files changed by commit from GitLab API

I am trying to use the GitLab API to grab all files changed by a particular commit, but I don't see a good way of doing it.
In other words, what's the equivalent of git show --name-only sha1 in the GitLab API?
The only way I know of is using the following API Call.
GET /projects/:id/repository/commits/:sha/diff
(See: https://docs.gitlab.com/ee/api/commits.html#get-the-diff-of-a-commit)
If you only need the changed files, you can simply look at the file paths (old_path, new_path) in the response.
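As a sketch, this is roughly the git show --name-only equivalent; the host, project ID, token placeholder, and the jq filter are assumptions:

# Lists the path of every file touched by the commit
curl --header "PRIVATE-TOKEN: <your_access_token>" \
  "https://gitlab.example.com/api/v4/projects/42/repository/commits/<sha>/diff" \
  | jq -r '.[].new_path'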

How to get pull request by id using JGit

I've started making Git API calls from Java through the JGit library.
I'm looking to get stats on which files changed in a pull request, and for that I'll need the pull request's details.
I'm having trouble finding an approach for getting a pull request by ID using JGit.
There is no clear and concise documentation, and the cookbook also doesn't cover this.
Pull requests are not part of core Git, so JGit does not support them either.
It therefore depends on which Git server you are accessing; GitHub exposes pull requests via special refs, e.g.
git fetch origin pull/ID/head:BRANCHNAME
e.g.
git fetch origin pull/223/head:local_test_branch
A similar fetch command via JGit should work as well.
See https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/checking-out-pull-requests-locally for details

Copy latest artifact from one path to another

I'm trying to copy the latest artifact from one path to another using Artifactory API.
POST /api/copy/{srcRepoKey}/{srcFilePath}?to=/{targetRepoKey}/{targetFilePath}[&dry=1][&suppressLayouts=0/1(default)][&failFast=0/1]
Let's say I have a few RPMs named: artifact-1.0-1.rpm, artifact-1.0-2.rpm and artifact-1.0-3.rpm.
How can I automatically copy the latest of these artifacts (here, artifact-1.0-3.rpm)?
With the next release of JFrog's CLI, planned in a couple of weeks, you'll be able to use SORT and LIMIT in the COPY command.
This will allow you to fetch only the latest item/artifact by sorting by date and limiting the result set to 1.
For now, you can use 2 sequential curl commands to accomplish what you're after:
First use an AQL SEARCH with your SORT and LIMIT to retrieve the relevant item's path, and then use your COPY command with that path.
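A sketch of those two calls; the host, credentials, and repository names are assumptions:

# 1) AQL search for the newest matching artifact (sort by creation date, keep 1)
curl -u user:password -X POST "https://artifactory.example.com/api/search/aql" \
  -H "Content-Type: text/plain" \
  -d 'items.find({"repo":"rpm-local","name":{"$match":"artifact-1.0-*.rpm"}}).sort({"$desc":["created"]}).limit(1)'

# 2) Copy the path returned by step 1 (shown with the hypothetical result)
curl -u user:password -X POST \
  "https://artifactory.example.com/api/copy/rpm-local/artifact-1.0-3.rpm?to=/target-repo/artifact-1.0-3.rpm"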
Note: the CLI's SORT and LIMIT feature has already been checked in to the CLI's dev branch, so if you wish to use a snapshot you can download and build the dev branch from GitHub, and then test whether the solution suits you.
I doubt that you can automatically copy all these artifacts in one statement. You can copy a folder, but no regex or pattern can be defined in the copy command.

How can I fetch the build versions from Artifactory repository

I have to fetch the list of builds present here : https://openmrs.jfrog.io/openmrs/public/org/openmrs/api/openmrs-api/
I am new to Artifactory and its API. I need to know the curl command for fetching all the version numbers listed there. How do I go about it? It's worth mentioning that the repo doesn't belong to me, so I do not have a username and password for it (in case that's needed).
The Artifactory REST API documentation uses localhost for this purpose and doesn't show such a link. I haven't set this repo up, so I do not know how it's done. Basically I am new to this, so any help would be appreciated.
Thanks
Assuming you would like to get the list of versions available in this folder, there are 2 possible options:
1) Use the folder info REST API method to get a list of all the sub-folders
curl https://openmrs.jfrog.io/openmrs/api/storage/public/org/openmrs/api/openmrs-api/
2) Download and parse the maven-metadata.xml file inside this folder. This file contains information about available versions.
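Sketches of both options; the jq and grep parsing is my own assumption (grep -oP requires GNU grep):

# Option 1: list the sub-folders (one per version) from the folder info response
curl -s "https://openmrs.jfrog.io/openmrs/api/storage/public/org/openmrs/api/openmrs-api/" \
  | jq -r '.children[] | select(.folder) | .uri'

# Option 2: extract the version numbers from maven-metadata.xml
curl -s "https://openmrs.jfrog.io/openmrs/public/org/openmrs/api/openmrs-api/maven-metadata.xml" \
  | grep -oP '(?<=<version>)[^<]+'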

How to migrate data & settings from one firebase to another?

Are there any tools to aid in data migration from dev to staging to prod? If not, are there plans to build them?
I know you can Export JSON and Import JSON from Forge, but that doesn't include authorization and security settings.
All of our data is available through a REST API, so you could easily write a script to do this yourself. You can export the data by setting format=export (this includes all of the priority data in the response):
curl https://myapp.firebaseIO.com/.json?format=export&auth=YOUR_FIREBASE_SECRET
As for exporting the security rules, you can access them here:
curl https://myapp.firebaseIO.com/.settings/rules/.json?auth=YOUR_FIREBASE_SECRET
You can then write them back to the new Firebase using PUT.
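As a sketch, assuming rules.json holds the exported rules and NEW_SECRET is the target Firebase's secret:

curl -X PUT -d @rules.json \
  "https://mynewapp.firebaseIO.com/.settings/rules/.json?auth=NEW_SECRET"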
The various Auth settings can't easily be automatically transferred (such as the Authorized Origins), but they probably shouldn't be as they'll differ between staging and production.
What Andrew said above is mostly correct; however, this can be a pain with large Firebases.
There is an import project at https://github.com/firebase/firebase-import that will help import large firebases by breaking up the put requests.
Also, something to note: you will need to use quotes around the curl URL, otherwise the & will background the process. So what Andrew gave above will work instead as
curl -o outputfile.json "https://myapp.firebaseIO.com/.json?format=export&auth=YOUR_FIREBASE_SECRET"
Then you can use the import module I linked with that json file.
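For example, following the firebase-import README (the database URL and file name here are hypothetical):

firebase-import --database_url https://mynewapp.firebaseIO.com --path / --json outputfile.json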
Good Luck!
If you want an option that doesn't require cURL, and you have the firebase-tools project installed, you can run this:
firebase database:get --export -o backup.json /
Note that this should be run from a working directory configured as a Firebase project. The advantage of this option is it will use the Auth you've set up for that project, so you don't need to hard-code auth keys into command lines (for the security-conscious) and it doesn't rely on the deprecated auth-key pattern.
Command-line Fu: Another cool technique if you want separate files for each top-level key is calling:
for i in `firebase database:get --shallow / | jq -r 'keys[]'`; do
  echo "Downloading $i..."
  firebase database:get --export -o "$i.json" "/$i"
done
You will need the "jq" tool installed for this to work. Exporting each collection separately can be really useful if you later want to restore or work with just a portion of your data.
Firebase is working on a new service, "S3 Customer Backups", that will copy a .gz-compressed backup of your entire Firebase nightly into an S3 bucket you give them. I'm evaluating the beta of this service right now, but if it is something you need, I recommend asking support about it.
Our Firebase got too large for the curl operation to complete, and this new solution will enable us to manage our dev environments. So if you have a large Firebase, set up the S3 Customer Backups, then use firebase-import to shove the data into your dev/staging Firebases. Victory!
I just created this Ruby gem for cloning Firebase Remote Config data from an existing project to a new one.
