ConvertApi Post Error

We are trying to merge multiple images into a single PDF document, and ConvertAPI seemed like the perfect solution for us.
However, when I tried a sample using your API, it returned:
{
  "Code": 5999,
  "Message": "The error mapping is missing."
}
My request is below. Can you point me in the right direction, please?
Url: https://v2.convertapi.com/pdf/to/merge?Secret=XXXXXXXXXXXXXXXX
Post Data:
{
  "Parameters": [
    {
      "Name": "StoreFile",
      "Value": "true"
    },
    {
      "Name": "PdfVersion",
      "Value": "1.7"
    },
    {
      "Name": "Files",
      "FileValues": [
        {
          "Url": "https://vignette.wikia.nocookie.net/gameofthrones/images/5/52/Viserion_brought_down.jpg"
        },
        {
          "Url": "https://vignette.wikia.nocookie.net/gameofthrones/images/5/52/Viserion_brought_down.jpg"
        },
        {
          "Url": "https://vignette.wikia.nocookie.net/gameofthrones/images/5/52/Viserion_brought_down.jpg"
        }
      ]
    }
  ]
}

Conversions can be chained by using the result file link of one conversion as the source file of the next. Result file URLs are treated specially, and such conversions are processed with high performance. Example:
Convert JPG to PDF
Request:
POST https://v2.convertapi.com/jpg/to/pdf?secret=XXX&storefile=true&file=https://example.com/file1.jpg
Result:
{
  "ConversionTime": 2,
  "Files": [
    {
      "FileName": "file1.pdf",
      "FileSize": 19456,
      "Url": "https://v2.convertapi.com/d/ZYXFNMPT/file1.pdf"
    }
  ]
}
Request:
POST https://v2.convertapi.com/jpg/to/pdf?secret=XXX&storefile=true&file=https://example.com/file2.jpg
Result:
{
  "ConversionTime": 2,
  "Files": [
    {
      "FileName": "file2.pdf",
      "FileSize": 19456,
      "Url": "https://v2.convertapi.com/d/QAZEDCTG/file2.pdf"
    }
  ]
}
Merge result PDF files
POST https://v2.convertapi.com/pdf/to/merge?Secret=XXXXXXXXXXXXXXXX
{
  "Parameters": [
    {
      "Name": "Files",
      "FileValues": [
        {
          "Url": "https://v2.convertapi.com/d/ZYXFNMPT/file1.pdf"
        },
        {
          "Url": "https://v2.convertapi.com/d/QAZEDCTG/file2.pdf"
        }
      ]
    }
  ]
}
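If it helps, the whole chain can be scripted. Below is a minimal sketch in TypeScript (assuming Node 18+ for the built-in fetch; the image URLs and the MY_SECRET placeholder are stand-ins, and error handling is kept minimal):

const SECRET = "MY_SECRET"; // your ConvertAPI secret (placeholder)

// Convert one remote JPG to a stored PDF and return the result file URL.
async function jpgToPdf(imageUrl: string): Promise<string> {
  const endpoint =
    `https://v2.convertapi.com/jpg/to/pdf?secret=${SECRET}` +
    `&storefile=true&file=${encodeURIComponent(imageUrl)}`;
  const res = await fetch(endpoint, { method: "POST" });
  if (!res.ok) throw new Error(`jpg/to/pdf failed: ${res.status}`);
  const data = await res.json();
  return data.Files[0].Url; // e.g. https://v2.convertapi.com/d/ZYXFNMPT/file1.pdf
}

// Merge the stored result PDFs into a single document.
async function mergePdfs(pdfUrls: string[]): Promise<unknown> {
  const res = await fetch(`https://v2.convertapi.com/pdf/to/merge?Secret=${SECRET}`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      Parameters: [
        { Name: "StoreFile", Value: "true" },
        { Name: "Files", FileValues: pdfUrls.map((url) => ({ Url: url })) },
      ],
    }),
  });
  if (!res.ok) throw new Error(`pdf/to/merge failed: ${res.status}`);
  return res.json();
}

// Convert each image in parallel, then merge the intermediate PDFs.
const images = ["https://example.com/file1.jpg", "https://example.com/file2.jpg"];
Promise.all(images.map((url) => jpgToPdf(url))).then(mergePdfs).then(console.log);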

Related

Not able to filter out required property in Azure TSI Gen1 Get Events API response

I am using the request body below to fetch only the required property values.
"searchSpan": {
"from": {
"dateTime": "2021-11-20T00:00:00.000Z"
},
"to": {
"dateTime": "2021-11-20T23:00:00.000Z"
}
},
"predicateString": "[Params.Name] = 'power'",
"take": 100
}
}
The URL is like the one below:
https://12345678a-bcde-3e91-blah-2292933292aa.env.timeseries.azure.com/events?api-version=2016-12-12
Despite specifying the required property, the response returns all properties, as if the predicate string had not been applied. What might I be doing wrong?
{
  "warnings": [],
  "events": [
    {
      "schema": {
        "rid": 0,
        "$esn": "my-event-hub",
        "properties": [
          {
            "name": "mytimestamp",
            "type": "DateTime"
          },
          {
            "name": "Params.Name",
            "type": "String"
          },
          {
            "name": "Params.Value",
            "type": "Double"
          }
        ]
      },
      "$ts": "2021-11-20T10:01:50Z",
      "values": [
        "2021-11-20T10:01:50Z",
        "energy",
        60
      ]
    },
    {
      "schemaRid": 0,
      "$ts": "2021-11-20T10:01:50Z",
      "values": [
        "2021-11-20T10:01:50Z",
        "power",
        10
      ]
    },
    {
      "schemaRid": 0,
      "$ts": "2021-11-20T10:01:50Z",
      "values": [
        "2021-11-20T10:01:50Z",
        "strength",
        200
      ]
    }
  ]
}
Edit
I'm getting a "Properties count error" on the TSI overview page. This might well be the root cause, but I don't know for sure:
"For Time Series Insights environment ABC: You have used all 641/600 properties in your environment".

Logic App > Cosmos PartitionKey Not Matching Error

I'm scared to put this out there because it should be so easy; I am facing the same issue as the posts here, here, and here, and I have tried each of the answers to no avail. Below are the current resulting input (redacted) and the related code view of the inputs.
The Result
{
  "method": "post",
  "headers": {
    "x-ms-documentdb-raw-partitionkey": "\"2020\""
  },
  "path": "/dbs/xxxx/colls/smtp/docs",
  "host": {
    "connection": {
      "name": "/subscriptions/..."
    }
  },
  "body": {
    "category": [
      [
        "cat facts"
      ]
    ],
    "email": "example@test.com",
    "event": "processed",
    "id": "yada",
    "partitionKey": "\"2020\"",
    "sg_event_id": "yada yada",
    "sg_message_id": "yada",
    "smtp-id": "yada",
    "timestamp": 1604345542
  }
}
The Code View
{
  "inputs": {
    "body": {
      "category": [
        "@items('For_each')['category']"
      ],
      "email": "@items('For_each')['email']",
      "event": "@items('For_each')['event']",
      "id": "@items('For_each')['sg_message_id']",
      "partitionKey": "\"@{formatDateTime(utcNow(),'yyyy')}\"",
      "sg_event_id": "@items('For_each')['sg_event_id']",
      "sg_message_id": "@items('For_each')['sg_message_id']",
      "smtp-id": "@items('For_each')['smtp-id']",
      "timestamp": "@items('For_each')['timestamp']"
    },
    "headers": {
      "x-ms-documentdb-raw-partitionkey": "\"@{formatDateTime(utcNow(),'yyyy')}\""
    }
  }
}
The error I'm getting is the usual one: PartitionKey extracted from document doesn't match the one specified in the header.
I just can't see what I'm missing here now.
Thanks all.
First, as Matias comments, check your partition key path.
Then change "partitionKey": "\"@{formatDateTime(utcNow(),'yyyy')}\"", to "partitionKey": "@{formatDateTime(utcNow(),'yyyy')}", in your document.
It works fine on my side.
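For reference, here is a minimal sketch of the corrected inputs, trimmed to the relevant fields (this assumes the collection's partition key path is /partitionKey): the raw partition key header keeps its escaped quotes, while the document property holds the plain value.

{
  "inputs": {
    "body": {
      "partitionKey": "@{formatDateTime(utcNow(),'yyyy')}"
    },
    "headers": {
      "x-ms-documentdb-raw-partitionkey": "\"@{formatDateTime(utcNow(),'yyyy')}\""
    }
  }
}

The header must carry the partition key as raw JSON, so a string value needs the embedded quotes; the document property is just the string itself. Writing "\"2020\"" into the document makes Cosmos extract a value containing literal quote characters, which no longer matches the header.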

Data Factory: JSON data is interpreted as expression - ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression

I want to copy items from
CosmosDB databaseA/productCollection
to
CosmosDB databaseB/productCollection
Therefore I decided to use Azure Data Factory.
I also activated "Export as-is to JSON files or Cosmos DB collection".
The read operation works as expected.
Unfortunately, the write operation stops because of an error related to the data:
ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'Currency'
{
  "ProductName": "Sample",
  "Price": {
    "@Currency": "GBP",
    "$": "2624.83"
  }
}
I'm not able to change the input data itself.
The output data has to equal the input data.
Is there a possibility that @Currency will not be interpreted as an expression?
In ARM, this part is failing:
Price.{@Currency}
I had the same problem and was able to resolve it as follows.
I am using a pipeline with a source that is a dataset referencing JSON data.
In the dataset's JSON definition, I had to change the JSON from
{
  "name": "SourceDataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "StorageAccountLink",
      "type": "LinkedServiceReference"
    },
    "annotations": [],
    "type": "Json",
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "test-data"
      }
    },
    "schema": {
      "type": "object",
      "properties": {
        "@context": {
          "type": "string"
        },
        "value": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "id": {
                "type": "string"
              }
            }
          }
        }
      }
    }
  }
}
to (escaping the @ with @@):
{
  "name": "SourceDataset",
  "properties": {
    "linkedServiceName": {
      "referenceName": "StorageAccountLink",
      "type": "LinkedServiceReference"
    },
    "annotations": [],
    "type": "Json",
    "typeProperties": {
      "location": {
        "type": "AzureBlobStorageLocation",
        "container": "test-data"
      }
    },
    "schema": {
      "type": "object",
      "properties": {
        "@@context": {
          "type": "string"
        },
        "value": {
          "type": "array",
          "items": {
            "type": "object",
            "properties": {
              "id": {
                "type": "string"
              }
            }
          }
        }
      }
    }
  }
}
I tried to reproduce your issue, but it works for me. I used a copy activity to transfer data from account A to account B.
Additionally, if this operation only needs to be executed once, consider using the Azure Cosmos DB Data Migration Tool. It's free to use. You could export the data from Cosmos DB A as a JSON file and then import it into Cosmos DB B very simply. It can also be executed from the command line, so it can be set up as a scheduled job on Windows.
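For a one-time export/import, the migration tool's command line (dt.exe) could be used along these lines (a sketch; the account endpoints, keys, database names, and file name are placeholders):

dt.exe /s:DocumentDB /s.ConnectionString:"AccountEndpoint=https://accountA.documents.azure.com:443/;AccountKey=<keyA>;Database=databaseA" /s.Collection:productCollection /t:JsonFile /t.File:products.json

dt.exe /s:JsonFile /s.Files:products.json /t:DocumentDB /t.ConnectionString:"AccountEndpoint=https://accountB.documents.azure.com:443/;AccountKey=<keyB>;Database=databaseB" /t.Collection:productCollection

The first command dumps databaseA/productCollection to a local JSON file; the second loads that file into databaseB/productCollection.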

Using Pact.eachLike() when array contents vary for each item

Hi, I have a consumer test produced using the pact npm package (https://www.npmjs.com/package/pact).
I use the following code to generate a pact.json:
provider
  .addInteraction({
    state: 'test',
    uponReceiving: 'a test',
    withRequest: {
      method: 'GET',
      path: '/test'
    },
    willRespondWith: {
      status: 200,
      headers: { 'Content-Type': 'application/json' },
      body: {
        "company": like("My big company"),
        "factories": eachLike({
          "location": like("Sydney"),
          "capacity": like(5)
        }, { min: 1 })
      }
    }
  })
  .then(function () { done(); });
It generates the following testconsumer-testprovider.json file:
{
  "consumer": {
    "name": "TestConsumer"
  },
  "provider": {
    "name": "TestProvider"
  },
  "interactions": [
    {
      "description": "a request for loans",
      "providerState": "broker is logged in, list all loans",
      "request": {
        "method": "GET",
        "path": "/test"
      },
      "response": {
        "status": 200,
        "headers": {
          "Content-Type": "application/vnd.hal+json"
        },
        "body": {
          "company": "My big company",
          "factories": [
            {
              "location": "Sydney",
              "capacity": 5
            }
          ]
        },
        "matchingRules": {
          "$.headers.Content-Type": {
            "match": "regex",
            "regex": "application\\/.*json.*"
          },
          "$.body.company": {
            "match": "type"
          },
          "$.body.factories": {
            "min": 1
          },
          "$.body.factories[*].*": {
            "match": "type"
          },
          "$.body.factories[*].location": {
            "match": "type"
          },
          "$.body.factories[*].capacity": {
            "match": "type"
          }
        }
      }
    }
  ],
  "metadata": {
    "pactSpecification": {
      "version": "3.0.0"
    }
  }
}
However, when we test against the following provider output, we get an error on geographicCoords because it's an unexpected key/value:
{
  "company": "My Company",
  "factories": [
    {
      "location": "Sydney",
      "capacity": 5
    },
    {
      "location": "Sydney",
      "geographicCoords": "-0.145,1.4445",
      "capacity": 5
    }
  ]
}
So is there a way to allow unexpected keys/values in arrays? We're only testing for required keys/values, and we don't want our Pact tests to fail in the future when new values are added by our providers.
The scenario you are describing is supported; see https://github.com/pact-foundation/pact-js/tree/master/examples/e2e for an example.
If you were to remove, say, the eligibility object and run the tests, everything would still work.
If you are still having trouble, please raise a defect on the pact-js repository and we'll get to the bottom of it.

copyIndex() inside a listKeys()

We're trying to deploy an ARM template which deploys a Stream Analytics job with n Event Hub outputs, depending on an input parameter.
Right now everything works except the listKeys() function inside the outputs property copy loop, which gets each Event Hub's primary key:
"sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', variables('clientEventHubNamespace'), parameters('clients')[copyIndex('outputs')].id, variables('clientEventHubClientSharedAccessName')), '2015-08-01').primaryKey]"
We get the error:
17:44:31 - Error: Code=InvalidTemplate; Message=Deployment template validation failed: 'The template resource 'tailor-router-axgf7t3gtspue' at line '129' and column '10' is not valid: The template function 'copyIndex' is not expected at this location. The function can only be used in a resource with copy specified. Please see https://aka.ms/arm-copy for usage details. Please see https://aka.ms/arm-template-expressions for usage details.'.
However, if we change this to be a specific index:
"sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', variables('clientEventHubNamespace'), parameters('clients')[0].id, variables('clientEventHubClientSharedAccessName')), '2015-08-01').primaryKey]"
it works.
Is copyIndex('propertyName') inside a listKeys() a supported function?
If not, is there a workaround that would achieve the same effect?
Kind regards,
Nick
Stream Analytics job resource definition:
{
  "apiVersion": "2016-03-01",
  "type": "Microsoft.StreamAnalytics/StreamingJobs",
  "name": "[variables('routerStreamAnalyticsName')]",
  "location": "[variables('location')]",
  "dependsOn": [ "clientsEventHubCopy" ],
  "tags": {
    "boundedContext": "[variables('boundedContextName')]"
  },
  "properties": {
    "sku": {
      "name": "[parameters('routerStreamAnalyticsSkuTier')]"
    },
    "outputErrorPolicy": "drop",
    "eventsOutOfOrderPolicy": "adjust",
    "eventsOutOfOrderMaxDelayInSeconds": 0,
    "eventsLateArrivalMaxDelayInSeconds": 5,
    "dataLocale": "en-US",
    "compatibilityLevel": "1.0",
    "inputs": [
      {
        "name": "input0",
        "properties": {
          "type": "stream",
          "serialization": {
            "type": "Avro"
          },
          "datasource": {
            "type": "Microsoft.ServiceBus/EventHub",
            "properties": {
              "serviceBusNamespace": "[parameters('input0EventHubNamespace')]",
              "sharedAccessPolicyName": "[parameters('input0EventHubSharedAccessPolicyName')]",
              "sharedAccessPolicyKey": "[parameters('input0EventHubSharedAccessPolicyKey')]",
              "eventHubName": "[parameters('input0EventHubName')]"
            }
          }
        }
      }
    ],
    "transformation": {
      "name": "routing",
      "properties": {
        "streamingUnits": "[parameters('routerStreamAnalyticsSkuTier')]",
        "query": "omitted"
      }
    },
    "copy": [
      {
        "name": "outputs",
        "count": "[length(parameters('clients'))]",
        "input": {
          "name": "[parameters('clients')[copyIndex('outputs')].id]",
          "properties": {
            "datasource": {
              "type": "Microsoft.ServiceBus/EventHub",
              "properties": {
                "serviceBusNamespace": "[variables('clientEventHubNamespace')]",
                "sharedAccessPolicyName": "[variables('clientEventHubClientSharedAccessName')]",
                "sharedAccessPolicyKey": "[listKeys(resourceId('Microsoft.EventHub/namespaces/eventhubs/authorizationRules', variables('clientEventHubNamespace'), parameters('clients')[copyIndex('outputs')].id, variables('clientEventHubClientSharedAccessName')), '2015-08-01').primaryKey]",
                "eventHubName": "[parameters('clients')[copyIndex('outputs')].id]"
              }
            },
            "serialization": {
              "type": "Avro"
            }
          }
        }
      }
    ]
  }
},
Thanks for reporting this, and sorry for the inconvenience.
I just talked to the ARM team; we had an issue when copyIndex was inside the index brackets, e.g. array[copyIndex()]. It should be fixed now.
Let us know how it goes.
Thanks,
JS - Azure Stream Analytics
