How do I add an attachment file from my computer in Alfresco?

When clicking the attachments "Select" button while adding a new item in a custom data list, I want to add files from Windows Explorer on the client side instead of from the server.
The image above shows attachments being added from the server; I don't want that. I want to add files from Windows Explorer on the client PC.

Alfresco 5.1: POST /alfresco/service/api/upload. In this version the new REST API doesn't include methods for nodes.
Alfresco 5.2 EA: the new REST endpoint POST /nodes/{nodeId}/children supports file upload using multipart/form-data.
curl -X POST \
--header 'Content-Type: application/json' \
--header 'Accept: application/json' \
--header 'Authorization: Basic YWRtaW46YWRtaW4=' \
-d '{"name":"my-new-file"}' \
'http://localhost:8080/alfresco/api/-default-/public/alfresco/versions/1/nodes/parent-node-id/children'
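The example above only creates a node from a JSON body. Since the question is about uploading a local file, here is a minimal sketch of the multipart/form-data variant of the same 5.2 endpoint; the parent node id, local file path, file name, and Basic credentials are placeholders:
curl -X POST \
--header 'Accept: application/json' \
--header 'Authorization: Basic YWRtaW46YWRtaW4=' \
-F filedata=@/path/to/local-file.pdf \
-F name=my-new-file.pdf \
'http://localhost:8080/alfresco/api/-default-/public/alfresco/versions/1/nodes/parent-node-id/children'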

Related

API filter in Visual Studio Code keeps being ignored and requests everything

I am doing some API integration and I'm just testing out Celoxis API commands; they use a REST API. I have downloaded Visual Studio Code and installed the REST API add-on. Normal cURL commands work, etc., but the filter request won't:
curl -g -X GET -H 'Content-Type: application/json' -H 'Authorization: bearer YourTokenHere' -G --data-urlencode 'filter={"project.id":"1234"}' 'https://app.celoxis.com/psa/api/v2/tasks'
Any help would be appreciated

Perform a system import/export from the command line in Artifactory

I need to know how I can perform a system export/import from the command line in Artifactory.
Is there any command that will do this task?
My version of Artifactory is 7.29.
You can make use of the Import/Export REST APIs for this purpose, as below.
Export full system:
curl -H 'Content-Type: application/json' -X POST https://myartifactory/api/export/system -d @export.json
Import full system:
curl -H 'Content-Type: application/json' -X POST https://myartifactory/api/import/system -d @export.json
Also, refer to this KB article on migrating Artifactory via system import/export and replacing the UI operations with the REST APIs.
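The contents of export.json are not shown above. As a rough sketch only (verify the supported fields against the JFrog system export documentation; the export path below is a placeholder), the settings file posted with -d @export.json might look like:
{
  "exportPath": "/backup/artifactory-export",
  "includeMetadata": true,
  "createArchive": false,
  "excludeContent": false,
  "verbose": false
}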

How to send a file in a URL via curl?

I have an image in a URL like https://i.postimg.cc/GpDskmSG/sdssds.png and I want to send it over to be stored at nft.storage. Could I send it directly as a URL instead of a file from a path? I tried the below but it only stores the image URL.
curl -H "Authorization: Bearer eyJhbGciOiJIwetertyrtyUzI1NiIsInR5cCI6Ikp" -H "Content-Type: image/png" --data "https://i.postimg.cc/GpDskmSG/sdssds.png" --url "https://api.nft.storage/upload"
P.S.: I'm testing with curl on Windows.
According to the API docs (https://nft.storage/api-docs/), it will accept multipart/form-data for this type of POST.
You can try using -F/--form instead, for which a good example exists on the curl man page (https://linux.die.net/man/1/curl):
curl -F "web=@index.html;type=text/html" url.com
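Applied to this question, a rough two-step sketch (assuming the upload endpoint accepts a multipart field named file; the token and local file name are placeholders) would be to download the image first and then send it as a form file:
curl -o sdssds.png 'https://i.postimg.cc/GpDskmSG/sdssds.png'
curl -X POST -H "Authorization: Bearer <YOUR_TOKEN>" -F "file=@sdssds.png;type=image/png" "https://api.nft.storage/upload"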

How to call API hooks or callback functions from JIRA to DocuSign and back from DocuSign to JIRA

Hi guys, I am working on a JIRA service desk. From there I need to pass some information to DocuSign when a button is clicked; a PDF will be generated in DocuSign, and DocuSign will return that PDF file to JIRA, where it will be linked to the issue. If this doesn't work directly, then any intermediate code in ASP.NET/MVC/Core would be great.
Thanks in advance.
You can use Python in between. You need to set up a webhook out of Jira to Airflow, for example.
The webhook configuration in your Jira would look like:
curl -X POST \
https://{airflow.SERVER}/api/v1/dags/{dag_name}/dagRuns \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/json' \
-H 'Authorization: Basic *****************' \
-d '{"conf":{"ticket":"{{issue.key}}"}}'
Then host your code there and set up the connection to Jira and your DocuSign.
The code could look like the following:
# JiraHook is provided by the Airflow Jira provider (the exact import path depends on your Airflow/provider version)
from airflow.providers.atlassian.jira.hooks.jira import JiraHook

atc_conn = JiraHook("jira").get_conn()  # "jira" is the Airflow connection id
jql = "project =...."  # JQL query selecting the issues to process
issues = atc_conn.search_issues(jql, fields=["attachment"])
for issue in issues:
    attachments = issue.fields.attachment
    for attachment in attachments:
        attachments_to_sign = attachment.get()  # download the attachment content
        ....
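The DocuSign side is elided above. Purely as an illustrative sketch, and not part of the original answer, creating an envelope from the downloaded PDF via the DocuSign eSignature REST API could look roughly like the following (account id, token, recipient, and base64 content are placeholders):
curl -X POST 'https://demo.docusign.net/restapi/v2.1/accounts/<ACCOUNT_ID>/envelopes' \
-H 'Authorization: Bearer <DOCUSIGN_TOKEN>' \
-H 'Content-Type: application/json' \
-d '{"emailSubject":"Please sign","status":"sent","documents":[{"documentBase64":"<BASE64_PDF>","name":"ticket.pdf","fileExtension":"pdf","documentId":"1"}],"recipients":{"signers":[{"email":"signer@example.com","name":"Signer","recipientId":"1"}]}}'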

How do I access Firestore using CURL with an API Key and service account token?

I am trying to access my Firestore database using cURL from a terminal session. I have read through the REST API documentation for Firestore and the authentication documentation for OAuth and service accounts. I have set up a service account and IAM roles in the API dashboard. I cannot determine from the documentation what the correct path and syntax are, or what to use for the API key and the Bearer token. For example, I am trying to receive a JSON response for the user xyz, document field FNAME, that is stored in a Firestore database (note: where do I find the databaseID?) in the project testproject.
Here are the cURL commands listed in the documentation:
curl \
'https://firestore.googleapis.com/v1beta1/%5BNAME%5D?key=[YOUR_API_KEY]' \
--header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
--header 'Accept: application/json' \
--compressed
curl --request POST \
'https://firestore.googleapis.com/v1beta2/%5BNAME%5D:exportDocuments?key=[YOUR_API_KEY]' \
--header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
--header 'Accept: application/json' \
--header 'Content-Type: application/json' \
--data '{}' \
--compressed
My questions are: what do I use for [YOUR_API_KEY]?
What do I use for [YOUR_ACCESS_TOKEN]?
I have tried the following credentials from a service account that I set up:
Service account key: 3......................e76
Unique ID: 1............39
for the API key and the access token, and I get a 403 error back.
I also have OAuth credentials:
Client ID: 2.....113-95.......cpqrarqb.....qnrpc.apps.googleusercontent.com
Client Secret: L......lq
I also tried the path
https://firestore.googleapis.com/v1/projects/{project_id}/databases/{database_id}/collectionGroups/{collectionId}/fields/{field_id}
which didn't work either...
Again, I am trying to read and write data in my Firestore database using cURL, as a proxy for what will be my REST APIs. Any help and assistance is much appreciated.
From the curl commands you have pasted, I understand that you want to export your Firestore collections to a Cloud Storage bucket. Furthermore, I understand you obtained the curl commands from the API explorer of the exportDocuments method.
To provide an API key value for [YOUR_API_KEY], you first need to create an API key in your GCP project; here is the process:
Go to the credentials section.
Click on the option at the top called 'Create Credentials'.
Select API key.
Copy and keep safe the value returned by the Cloud Console (this is your API key).
If you want to know more about API keys, you can visit this.
To provide an OAuth token value, you can do the following:
Open Cloud Shell.
Run the command gcloud auth application-default print-access-token.
Copy and keep safe the value returned by Cloud Shell (this is your OAuth token).
Please note that there are several ways to create an OAuth token, but the one I specified is the fastest. You may also use the OAuth playground to generate your token; keep in mind that the token is valid for 60 minutes.
As for the database id, I have used (default), and here I include my curl statement:
curl --request POST \
'https://firestore.googleapis.com/v1/projects/[PROJECT_ID]/databases/(default):exportDocuments?key=[YOUR_API_KEY]' \
--header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
--header 'Accept: application/json' \
--header 'Content-Type: application/json' \
--data '{"collectionIds":["users"],"outputUriPrefix":"gs://[BUCKET_PATH]"}' \
--compressed
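The export call above writes collections out to Cloud Storage. For the read case the question describes (fetching the document for user xyz and its FNAME field), a minimal sketch using the documents get method of the same REST API could look like the following, assuming a collection named users in the (default) database (the collection and document ids are placeholders):
curl 'https://firestore.googleapis.com/v1/projects/[PROJECT_ID]/databases/(default)/documents/users/xyz?key=[YOUR_API_KEY]' \
--header 'Authorization: Bearer [YOUR_ACCESS_TOKEN]' \
--header 'Accept: application/json' \
--compressed
The returned JSON contains the document's values (FNAME included) under the fields key.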
