"AnonymousModel" Embedded Models in "Application Model" with Push Component and Postgre - push-notification

How is it possible to use the Application model with APNS settings and PostgreSQL?
The Application model has embedded models.
Am I right that in traditional relational databases the embedded models are simply saved as a serialized object?
The string column for pushSettings is a varchar(1024).
The push example does this:
pushSettings: {
apns: {
certData: config.push.apnsCertData,
keyData: config.push.apnsKeyData,
feedbackOptions: {
batchFeedback: true,
interval: 300
}
},
gcm: {
serverApiKey: config.push.gcmServerApiKey
}
}
The certData and keyData are too long for 1024 characters.
So how do I use this correctly with PostgreSQL?
Right now the only option I see is to extend the Application model and give the pushSettings field a larger column, but I have not been able to get that working either.
Please help.
Regards

I managed to get it working the way I originally intended.
The extended application model JSON:
{
"name": "application",
"base": "Application",
"properties": {
"pushSettings": {
"postgresql": {
"dataType": "text",
"dataLength": null,
"nullable": "YES"
},
"apns": {
"production": {
"type": "boolean",
"description": [
"Production or development mode. It denotes what default APNS",
"servers to be used to send notifications.",
"See API documentation for more details."
]
},
"certData": {
"type": "string",
"description": "The certificate data loaded from the cert.pem file"
},
"keyData": {
"type": "string",
"description": "The key data loaded from the key.pem file"
},
"pushOptions": {
"type": {
"gateway": "string",
"port": "number"
}
},
"feedbackOptions": {
"type": {
"gateway": "string",
"port": "number",
"batchFeedback": "boolean",
"interval": "number"
}
}
},
"gcm": {
"serverApiKey": "string"
}
}
}
}
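For the override to take effect, the extended model also has to be attached to the PostgreSQL datasource in server/model-config.json. A minimal sketch, assuming the datasource is named postgresDs (adjust to your own datasource name):
{
  "application": {
    "dataSource": "postgresDs",
    "public": true
  }
}
After the table is recreated (for example via automigrate), the pushSettings column should then be created as text instead of varchar(1024).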

Related

Is there a way to expand references in Swashbuckle to provide inline schemas?

Is there a mechanism in Swashbuckle that can prevent definitions from being created and then referenced in parameters/responses/etc.?
By default, you might get a path that looks like this:
"/profile": {
"get": {
"summary": "Get my profile details.",
"produces": [
"application/json",
],
"parameters": [],
"responses": {
"200": {
"description": "Success",
"schema": {
"$ref": "#/definitions/ProfileModel"
}
}
}
}
}
But what I'd like is for it to expand the schema inline like this:
"/profile": {
"get": {
"summary": "Get my profile details.",
"produces": [
"application/json",
],
"parameters": [],
"responses": {
"200": {
"description": "Success",
"schema": {
"type": "object",
"properties": {
"id": {
"type": "string",
"description": "id"
},
"firstName": {
"type": "string",
"description": "firstName"
},
"surname": {
"type": "string",
"description": "surname"
},
"emailAddress": {
"type": "string",
"description": "emailAddress"
}
}
}
}
}
}
}
I reviewed this StackOverflow question and I don't think it's what I'm looking for (or maybe I misinterpreted it).
I've taken a look through the Swashbuckle README to understand its capabilities but am coming up short. Any help here would be most appreciated.
For additional context, looking at the Swashbuckle PDF documentation in section 1.7, I essentially want to bypass or revert the behavior they describe as
automatically generating a corresponding schema for user-defined reference types and reference the definition via the $ref keyword.
Digging into the codebase a little, it looks like this isn't possible at the moment.
However, you can create a custom ISchemaGenerator based on the one in the source and alter the GenerateConcreteSchema method under the DataType.Object case so that it does not return a reference; this solves the issue.

Data Factory: JSON data is interpreted as an expression - ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression

I want to copy items from
CosmosDB databaseA/productCollection
to
CosmosDB databaseB/productCollection
Therefore I decided to use Azure Data Factory.
I also activated "Export as-is to JSON files or Cosmos DB collection".
The read operation works as expected.
Unfortunately, the write operation stops because of an error related to the data:
ErrorCode=InvalidTemplate, ErrorMessage=Unable to parse expression 'Currency'
{
"ProductName": "Sample",
"Price": {
"#Currency": "GBP",
"$": "2624.83"
}
}
I'm not able to change the input data itself.
The output data has to equal the input data.
Is there a possibility that #Currency will not be interpreted as an expression?
In ARM, this part is failing:
Price.{#Currency}
I had the same problem and I was able to resolve it as follows.
I am using a pipeline with a source that is a dataset referencing JSON data.
Clicking the highlighted button (screenshot not included here), I had to change the JSON from
{
"name": "SourceDataset",
"properties": {
"linkedServiceName": {
"referenceName": "StorageAccountLink",
"type": "LinkedServiceReference"
},
"annotations": [],
"type": "Json",
"typeProperties": {
"location": {
"type": "AzureBlobStorageLocation",
"container": "test-data"
}
},
"schema": {
"type": "object",
"properties": {
"#context": {
"type": "string"
},
"value": {
"type": "array",
"items": {
"type": "object",
"properties": {
"id": {
"type": "string"
}
}
}
}
}
}
}
}
to (escaping the # with ##):
{
"name": "SourceDataset",
"properties": {
"linkedServiceName": {
"referenceName": "StorageAccountLink",
"type": "LinkedServiceReference"
},
"annotations": [],
"type": "Json",
"typeProperties": {
"location": {
"type": "AzureBlobStorageLocation",
"container": "test-data"
}
},
"schema": {
"type": "object",
"properties": {
"##context": {
"type": "string"
},
"value": {
"type": "array",
"items": {
"type": "object",
"properties": {
"id": {
"type": "string"
}
}
}
}
}
}
}
}
I tried to reproduce your issue but it works for me. I used a copy activity to transfer data from account A to account B.
Additionally, if this operation only needs to be executed once, please consider the Azure Cosmos DB Data Migration Tool. It's free to use. You can export the data from Cosmos DB A as a JSON file and then import it into Cosmos DB B very simply. It can also be run from the command line, so it can be set up as a scheduled job on Windows.
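For reference, a minimal sketch of what such a copy activity can look like in the pipeline JSON; the activity and dataset names are placeholders, and the source/sink types assume the Cosmos DB SQL API connector:
{
  "name": "CopyProducts",
  "type": "Copy",
  "inputs": [
    { "referenceName": "CosmosSourceDataset", "type": "DatasetReference" }
  ],
  "outputs": [
    { "referenceName": "CosmosSinkDataset", "type": "DatasetReference" }
  ],
  "typeProperties": {
    "source": { "type": "CosmosDbSqlApiSource" },
    "sink": { "type": "CosmosDbSqlApiSink", "writeBehavior": "insert" }
  }
}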

Swagger Schema error should NOT have additional properties

I am trying to create a swagger JSON file and am checking its validity in
http://editor.swagger.io
Upon validating the JSON, the above-mentioned editor gives the following error:
Schema error should NOT have additional properties
additionalProperty: components
Jump to line 0
If for some reason I can't define an element named components at the root level (where I was going to put some sort of JSON schema), what is the right way to do a $ref to the schema for the request body of an API operation, as I intend to do in my example below? Also, do you see anything wrong with my swagger JSON?
My swagger JSON for Swagger 2.0 looks like this.
{
"swagger": "2.0",
"info": {
"version": "1.0",
"title": "My swagger API",
"contact": {
"name": "myName",
"email": "abc#gmail.com"
}
},
"host": "localhost:1234",
"basePath": "/",
"tags": [{
"name": "someTagName",
"description": "this is a try"
}],
"components":{"schemas": {"myEndPoint": ""}},
"paths": {
"/myEndPoint": {
"post": {
"tags": ["some-tag"],
"summary": "myEndPoint endpoint will give you something",
"description": "some description will go here",
"operationId": "getMyEndPoint",
"consumes": ["application/json"],
"produces": ["application/json"],
"parameters": [{
"in": "body",
"name": "somePayload",
"description": "somePayload is what this is",
"required": true,
"schema": {
"$ref": "#components/schemas/myEndPoint"
}
},
{
"in": "header",
"name": "Authorization",
"description": "auth token goes here",
"required": true,
"type": "string"
}],
"responses": {
"200": {
"description": "Success",
"schema": {
"type": "string"
}
},
"400": {
"description": "Bad Request"
}
}
}
}
}
}
You are mixing up OpenAPI 3.0 and 2.0 syntax. The components keyword is used in OpenAPI 3.0. In OpenAPI/Swagger 2.0, reusable schemas live under definitions:
"definitions": {
"myEndPoint": {
...
}
}
Make sure to also change the $ref to
"$ref": "#/definitions/myEndPoint"

Hosting my own service (.NET) - what is a valid response for Alexa?

I'm hosting my own service (HTTPS) on Azure - I have selected 'my endpoint is a subdomain with a wildcard cert'
I'm using Alexa.NET to craft the response.
I can verify that the simulator is hitting my endpoint (I did remote debugging and saw the breakpoint being hit) and I know that my endpoint is returning this (I tried it in Postman):
{
"Version": "1.0",
"SessionAttributes": null,
"Response": {
"OutputSpeech": {
"Type": "PlainText",
"Text": "test successful"
},
"Card": null,
"Reprompt": null,
"ShouldEndSession": true,
"Directives": []
}
}
I can't find any documentation on what the response is supposed to look like. I guess I can try creating the same thing with a lambda function...
Anyone have any suggestions on what I can try? This whole process of hosting my own service has been very frustrating...
Please find the sample response format here: https://developer.amazon.com/public/solutions/alexa/alexa-skills-kit/docs/alexa-skills-kit-interface-reference#response-body-syntax
{
"version": "string",
"sessionAttributes": {
"string": "<object>"
},
"response": {
"outputSpeech": {
"type": "string",
"text": "string",
"ssml": "string"
},
"card": {
"type": "string",
"title": "string",
"content": "string",
"text": "string",
"image": {
"smallImageUrl": "string",
"largeImageUrl": "string"
}
},
"reprompt": {
"outputSpeech": {
"type": "string",
"text": "string",
"ssml": "string"
}
},
"directives": [
{
"type": "Display.RenderTemplate",
"template": {
"type": "string"
...
}
},
{
"type": "AudioPlayer",
"playBehavior": "string",
"audioItem": {
"stream": {
"token": "string",
"url": "string",
"offsetInMilliseconds": 0
}
}
},
{
"general": {
"type": "VideoApp.Launch",
"videoItem": {
"source": "string",
"metadata": {
"title": "string",
"subtitle": "string"
}
}
}
}
],
"shouldEndSession": boolean
}
}
It was because my property names started with an uppercase letter. Friggin' JavaScript serializer...
But thanks Vijay for the pointer to the documentation.
In .NET MVC, this is how you make the property names camelCase:
return JsonConvert.SerializeObject(alexaSkillResponse, new JsonSerializerSettings {
ContractResolver = new CamelCasePropertyNamesContractResolver()
});
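With that contract resolver in place, the endpoint from the question should serialize with lowercase (camelCase) property names, roughly like this (a sketch based on the response shown above; the null members may still be emitted unless null handling is configured to ignore them):
{
  "version": "1.0",
  "response": {
    "outputSpeech": {
      "type": "PlainText",
      "text": "test successful"
    },
    "shouldEndSession": true,
    "directives": []
  }
}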

Strongloop loopback: how to set up a model to store multimedia?

Could someone give me an example of a model named "Multimedia.json" that I could use to store uploaded multimedia files (audio, video, or jpg)? Specifically, is there a type called "multimedia" or "binary"? Any strategies the Strongloop way?
Thanks,
{
"name": "Multimedia",
"base": "PersistedModel",
"idInjection": true,
"options": {
"validateUpsert": true
},
"properties": {
"title": {
"type": "string",
"required":true
},
"description": {
"type": "string"
},
"category": {
"type": "string" ---can this be a type called binary?
}
},
"validations": [],
"relations": {
"report": {
"type": "belongsTo",
"model": "Report",
"foreignKey": ""
}
},
"acls": [],
"methods": []
}
There is no built-in way to do this, but StrongLoop themselves refer to the post below in their documentation as the current best method for storing metadata alongside files handled by the storage component:
https://stackoverflow.com/a/31118294/1753891
https://docs.strongloop.com/display/public/LB/Storage+component
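For the file bytes themselves, the usual LoopBack approach is the storage component rather than a binary property. A minimal sketch, assuming loopback-component-storage is installed; the names storage, ./server/storage and MultimediaContainer are placeholders. In server/datasources.json:
"storage": {
  "name": "storage",
  "connector": "loopback-component-storage",
  "provider": "filesystem",
  "root": "./server/storage"
}
And a container model (MultimediaContainer.json) attached to that datasource:
{
  "name": "MultimediaContainer",
  "base": "Model",
  "properties": {}
}
The Multimedia model above then keeps only the metadata (title, description, category) plus a reference to the container and file name of the uploaded file.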
