How can I create a Pydantic Model for my FastAPI endpoint when my JSON is something like this?

{
  "events": [
    {
      "type": "message",
      "replyToken": "0bc647fc5282423cde13fffbc947a8",
      "source": {
        "userId": "U996ed69353d3c962ee17b33d9af3e2",
        "type": "user"
      },
      "timestamp": 161185209914,
      "mode": "active",
      "message": {
        "type": "text",
        "id": "1346188304367",
        "text": " hello"
      }
    }
  ],
  "destination": "Uf44eb3ba6c4b87adbfaa4a517e"
}
This is the JSON payload from the webhook I'm using. How can I write a Pydantic model for it?

This should work fine with the request body you've given above.
from pydantic import BaseModel
from typing import List


class Source(BaseModel):
    userId: str
    type: str


class Message(BaseModel):
    type: str
    id: str
    text: str


class Event(BaseModel):
    type: str
    replyToken: str
    source: Source
    timestamp: int
    mode: str
    message: Message


class Webhook(BaseModel):
    events: List[Event]
    destination: str
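For completeness, here is a minimal sketch of how the model plugs into an endpoint, using the models defined above (the app instance and the /webhook path are illustrative assumptions, not part of the question):

from fastapi import FastAPI

app = FastAPI()

@app.post("/webhook")  # hypothetical path, for illustration only
async def handle_webhook(webhook: Webhook):
    # FastAPI parses and validates the JSON body against the Webhook
    # model; a payload that doesn't match returns a 422 automatically.
    return {"events_received": len(webhook.events)}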
Here is the OpenAPI schema for the Webhook model.
{
  "title": "Webhook",
  "type": "object",
  "properties": {
    "events": {
      "title": "Events",
      "type": "array",
      "items": {
        "$ref": "#/definitions/Event"
      }
    },
    "destination": {
      "title": "Destination",
      "type": "string"
    }
  },
  "required": [
    "events",
    "destination"
  ],
  "definitions": {
    "Source": {
      "title": "Source",
      "type": "object",
      "properties": {
        "userId": {
          "title": "Userid",
          "type": "string"
        },
        "type": {
          "title": "Type",
          "type": "string"
        }
      },
      "required": [
        "userId",
        "type"
      ]
    },
    "Message": {
      "title": "Message",
      "type": "object",
      "properties": {
        "type": {
          "title": "Type",
          "type": "string"
        },
        "id": {
          "title": "Id",
          "type": "string"
        },
        "text": {
          "title": "Text",
          "type": "string"
        }
      },
      "required": [
        "type",
        "id",
        "text"
      ]
    },
    "Event": {
      "title": "Event",
      "type": "object",
      "properties": {
        "type": {
          "title": "Type",
          "type": "string"
        },
        "replyToken": {
          "title": "Replytoken",
          "type": "string"
        },
        "source": {
          "$ref": "#/definitions/Source"
        },
        "timestamp": {
          "title": "Timestamp",
          "type": "integer"
        },
        "mode": {
          "title": "Mode",
          "type": "string"
        },
        "message": {
          "$ref": "#/definitions/Message"
        }
      },
      "required": [
        "type",
        "replyToken",
        "source",
        "timestamp",
        "mode",
        "message"
      ]
    }
  }
}

Related

How does one work with third-party resources in ARM templates?

I'm trying to find the template reference for the SendGrid resource in Azure. It's possible to deploy SendGrid through ARM, but I cannot find any documentation with details about parameters, etc.
Here is a sample SendGrid account template.
{
  "$schema": "http://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "parameters": {
    "name": {
      "type": "String"
    },
    "location": {
      "type": "String"
    },
    "plan_name": {
      "type": "String"
    },
    "plan_publisher": {
      "type": "String"
    },
    "plan_product": {
      "type": "String"
    },
    "plan_promotion_code": {
      "type": "String"
    },
    "password": {
      "type": "SecureString"
    },
    "email": {
      "type": "String"
    },
    "firstName": {
      "type": "String"
    },
    "lastName": {
      "type": "String"
    },
    "company": {
      "type": "String"
    },
    "website": {
      "type": "String"
    },
    "acceptMarketingEmails": {
      "type": "String"
    },
    "tags": {
      "type": "Object"
    }
  },
  "resources": [{
    "type": "Sendgrid.Email/accounts",
    "apiVersion": "2015-01-01",
    "name": "[parameters('name')]",
    "location": "[parameters('location')]",
    "tags": "[parameters('tags')]",
    "plan": {
      "name": "[parameters('plan_name')]",
      "publisher": "[parameters('plan_publisher')]",
      "product": "[parameters('plan_product')]",
      "promotionCode": "[parameters('plan_promotion_code')]"
    },
    "properties": {
      "password": "[parameters('password')]",
      "acceptMarketingEmails": "[parameters('acceptMarketingEmails')]",
      "email": "[parameters('email')]",
      "firstName": "[parameters('firstName')]",
      "lastName": "[parameters('lastName')]",
      "company": "[parameters('company')]",
      "website": "[parameters('website')]"
    }
  }]
}

Write R GET query from data provided

I have JSON data describing a request that I need to reproduce in R, with the URL, access token, parameters, and everything else that must be included in the httr query. I got it from a Postman item. Here it is:
{
  "name": "GET /tablesbyquery",
  "protocolProfileBehavior": {
    "disableBodyPruning": true
  },
  "request": {
    "auth": {
      "type": "bearer",
      "bearer": [
        {
          "key": "token",
          "value": "hnjP4YUF-woR0jhUyJIByeOI_q8jF99jK5WlQ", # fake, for example
          "type": "string"
        }
      ]
    },
    "method": "GET",
    "header": [],
    "body": {
      "mode": "raw",
      "raw": "",
      "options": {
        "raw": {
          "language": "json"
        }
      }
    },
    "url": {
      "raw": "http://somesource.os-pub.com/tables/tablesbyquery?timePeriod=actual&sort=utdDate,asc&query=&isNotEmptyStatus=false&size=20&page=0&providerId=",
      "protocol": "http",
      "host": [
        "somesource",
        "os-pub",
        "com"
      ],
      "path": [
        "tables",
        "tablesbyquery"
      ],
      "query": [
        {
          "key": "timePeriod",
          "value": "actual"
        },
        {
          "key": "sort",
          "value": "utdDate,asc"
        },
        {
          "key": "query",
          "value": ""
        },
        {
          "key": "isNotEmptyStatus",
          "value": "false"
        },
        {
          "key": "size",
          "value": "20"
        },
        {
          "key": "page",
          "value": "0"
        },
        {
          "key": "providerId",
          "value": ""
        }
      ]
    }
  },
  "response": []
}
I need to write an R query to GET the necessary data. How should it look? I'm new to this, and I have written this part:
url <- "http://somesource.os-pub.com/tables/tablesbyquery?timePeriod=actual&sort=utdDate,asc&query=&isNotEmptyStatus=false&size=20&page=0&providerId="
access_token <- "hnjP4YUF-woR0jhUyJIByeOI_q8jF99jK5WlQ"
res <- GET(url, access_token)
content(res, "parsed")
But it doesn't work.
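A minimal sketch of how this request could be written with httr, assuming the bearer token belongs in an Authorization header, as the Postman export's auth block indicates:

library(httr)

url <- "http://somesource.os-pub.com/tables/tablesbyquery"
access_token <- "hnjP4YUF-woR0jhUyJIByeOI_q8jF99jK5WlQ"  # fake, as above

# Attach the token as a bearer Authorization header; the bare string in
# GET(url, access_token) is not a valid way to send the token, which is
# why the attempt above fails.
res <- GET(
  url,
  add_headers(Authorization = paste("Bearer", access_token)),
  query = list(
    timePeriod = "actual",
    sort = "utdDate,asc",
    query = "",
    isNotEmptyStatus = "false",
    size = 20,
    page = 0,
    providerId = ""
  )
)
content(res, "parsed")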

How to get rid of "API must not have local definitions (i.e. only $refs are allowed)" Swaggerhub standardization error with Springfox

I have a Swagger api-docs.json definition generated by SpringFox.
Below is a minimal reproducible example:
{
  "swagger": "2.0",
  "info": {
    "description": "Example REST API.",
    "version": "15.11.02",
    "title": "Example REST API",
    "contact": {
      "name": "ExampleTeam",
      "url": "https://example.com/",
      "email": "support@example.com"
    },
    "license": {
      "name": "Apache License 2.0",
      "url": "https://www.apache.org/licenses/LICENSE-2.0.txt"
    }
  },
  "host": "d01088db.ngrok.io",
  "basePath": "/cloud",
  "tags": [
    {
      "name": "All Endpoints",
      "description": " "
    }
  ],
  "paths": {
    "/api/v2/users/{userId}/jobs/{jobId}": {
      "get": {
        "tags": [
          "Builds",
          "All Endpoints"
        ],
        "summary": "Get job.",
        "operationId": "getJobUsingGET",
        "produces": [
          "*/*"
        ],
        "parameters": [
          {
            "name": "jobId",
            "in": "path",
            "description": "jobId",
            "required": true,
            "type": "integer",
            "format": "int64"
          },
          {
            "name": "userId",
            "in": "path",
            "description": "userId",
            "required": true,
            "type": "integer",
            "format": "int64"
          }
        ],
        "responses": {
          "200": {
            "description": "OK",
            "schema": {
              "$ref": "#/definitions/APIPipelineJob"
            }
          },
          "401": {
            "description": "Unauthorized"
          },
          "403": {
            "description": "Forbidden"
          },
          "404": {
            "description": "Not Found"
          }
        },
        "deprecated": false
      }
    }
  },
  "definitions": {
    "APIPipelineJob": {
      "type": "object",
      "properties": {
        "archiveTime": {
          "type": "string",
          "format": "date-time",
          "example": "example"
        },
        "content": {
          "type": "string",
          "example": "example"
        },
        "createTime": {
          "type": "string",
          "format": "date-time",
          "example": "example"
        },
        "id": {
          "type": "integer",
          "format": "int64",
          "example": "example"
        },
        "name": {
          "type": "string",
          "example": "example"
        },
        "selfURI": {
          "type": "string",
          "example": "example"
        },
        "type": {
          "type": "string",
          "example": "example",
          "enum": [
            "BUILD",
            "DEPLOY"
          ]
        },
        "userId": {
          "type": "integer",
          "format": "int64",
          "example": "example"
        }
      },
      "title": "APIPipelineJob",
      "xml": {
        "name": "APIPipelineJob",
        "attribute": false,
        "wrapped": false
      }
    }
  }
}
When I import it into SwaggerHub, I get the standardization error:
'definitions.*' not allowed -> API must not have local definitions (i.e. only $refs are allowed)
I have found the recommended solution in the SwaggerHub documentation, but my question is how to achieve either of the following with Springfox:
splitting into domains (and then using a reference), or
inlining schemas.
Or maybe there is another way to get rid of the above standardization error?
If you go to your home page, then hover over your organization on the left-hand side and go to Settings > Standardization, you should see some options. Unselect "API must not have local definitions (i.e. only $refs are allowed)" at the bottom.
And don't forget to save at the top right!

How do I pass RegistrationKey to the Azure DSC extension

I have the template below, which errors out during deployment with the error below. The samples on the documentation page seem to be erroneous and don't even compile.
"message": "VM has reported a failure when processing extension
'Microsoft.Powershell.DSC'. Error message: \"The DSC Extension failed
to install: Invalid type for parameter RegistrationKey of type
PSCredential.\nMore information about the failure can be found in the
logs located under
'C:\WindowsAzure\Logs\Plugins\Microsoft.Powershell.DSC\2.74.0.0'
on the VM.\nTo retry install, please remove the extension from the VM
first. \"."
Template
{
  "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "name": "[parameters('swarmmanager1Name')]",
      "type": "Microsoft.Compute/virtualMachines",
      "location": "[resourceGroup().location]",
      "apiVersion": "2015-06-15",
      "tags": {
        "displayName": "swarmmanager1"
      },
      "properties": {
        "hardwareProfile": {
          "vmSize": "[parameters('swarmmanager1VmSize')]"
        },
        "licenseType": "[parameters('LicenseType')]",
        "osProfile": {
          "computerName": "[parameters('swarmmanager1Name')]",
          "adminUsername": "[parameters('adminUsername')]",
          "adminPassword": "[parameters('adminPassword')]"
        },
        "storageProfile": {
          "imageReference": {
            "publisher": "[parameters('swarmmanager1ImagePublisher')]",
            "offer": "[parameters('swarmmanager1ImageOffer')]",
            "sku": "[parameters('windowsOSVersion')]",
            "version": "latest"
          },
          "osDisk": {
            "name": "swarmmanager1OSDisk",
            "vhd": {
              "uri": "[concat(reference(resourceId('Microsoft.Storage/storageAccounts', parameters('dockerswarmstorageaccountName')), '2016-01-01').primaryEndpoints.blob, parameters('swarmmanager1StorageAccountContainerName'), '/', parameters('swarmmanager1OSDiskName'), '.vhd')]"
            },
            "caching": "ReadWrite",
            "createOption": "FromImage"
          }
        },
        "networkProfile": {
          "networkInterfaces": [
            {
              "id": "[resourceId('Microsoft.Network/networkInterfaces', parameters('swarmmanager1NicName'))]"
            }
          ]
        }
      },
      "resources": [
        {
          "name": "Microsoft.Powershell.DSC",
          "type": "extensions",
          "location": "[resourceGroup().location]",
          "apiVersion": "2015-06-15",
          "dependsOn": [
            "[resourceId('Microsoft.Compute/virtualMachines', parameters('swarmmanager1Name'))]"
          ],
          "tags": {
            "displayName": "DSC"
          },
          "properties": {
            "publisher": "Microsoft.Powershell",
            "typeHandlerVersion": "2.26",
            "type": "DSC",
            "autoUpgradeMinorVersion": true,
            "forceUpdateTag": "[parameters('DSCExtensionManagerTagVersion')]",
            "settings": {
              "wmfVersion": "latest",
              "configurationArguments": {
                //"RegistrationKey": {
                //  "UserName": "PLACEHOLDER_DONOTUSE",
                //  "Password": "PrivateSettingsRef:registrationKeyPrivate"
                //},
                "RegistrationKey": "[parameters('RegistrationKey')]",
                "RegistrationUrl": "[parameters('registrationUrl')]",
                "NodeConfigurationName": "SwarmManager.localhost",
                "RebootNodeIfNeeded": true
              }
            },
            "protectedSettings": {
              "Items": {
                "registrationKeyPrivate": "[parameters('RegistrationKey')]"
              }
            }
          }
        }
      ]
    },
    {
      "name": "[parameters('dockerswarmstorageaccountName')]",
      "type": "Microsoft.Storage/storageAccounts",
      "location": "[resourceGroup().location]",
      "apiVersion": "2016-01-01",
      "sku": {
        "name": "[parameters('dockerswarmstorageaccountType')]"
      },
      "dependsOn": [],
      "tags": {
        "displayName": "dockerswarmstorageaccount"
      },
      "kind": "Storage"
    },
    {
      "name": "[parameters('swarmmanager1NicName')]",
      "type": "Microsoft.Network/networkInterfaces",
      "location": "[resourceGroup().location]",
      "apiVersion": "2016-03-30",
      "tags": {
        "displayName": "swarmmanager1Nic"
      },
      "properties": {
        "ipConfigurations": [
          {
            "name": "ipconfig1",
            "properties": {
              "privateIPAllocationMethod": "Dynamic",
              "subnet": {
                "id": "[parameters('swarmmanager1SubnetRef')]"
              },
              "publicIPAddress": {
                "id": "[resourceId('Microsoft.Network/publicIPAddresses', parameters('swarmmanagerpublicIPName'))]"
              }
            }
          }
        ]
      }
    },
    {
      "apiVersion": "2016-03-30",
      "dependsOn": [],
      "location": "[resourceGroup().location]",
      "name": "[parameters('swarmmanagerpublicIPName')]",
      "properties": {
        "publicIPAllocationMethod": "Dynamic",
        "dnsSettings": {
          "domainNameLabel": "[parameters('swarmmanagerpublicIPDnsName')]"
        }
      },
      "tags": {
        "displayName": "swarmmanagerpublicIP"
      },
      "type": "Microsoft.Network/publicIPAddresses"
    }
  ],
  "parameters": {
    "swarmmanager1Name": { "type": "string" },
    "swarmmanager1VmSize": { "type": "string" },
    "adminUsername": { "type": "string" },
    "adminPassword": { "type": "securestring" },
    "dockerswarmstorageaccountName": { "type": "string" },
    "dockerswarmstorageaccountType": { "type": "string" },
    "swarmmanager1NicName": { "type": "string" },
    "swarmmanagerpublicIPName": { "type": "string" },
    "swarmmanager1SubnetRef": { "type": "string" },
    "swarmmanager1ImagePublisher": { "type": "string" },
    "swarmmanager1ImageOffer": { "type": "string" },
    "windowsOSVersion": { "type": "string" },
    "swarmmanager1StorageAccountContainerName": { "type": "string" },
    "swarmmanager1OSDiskName": { "type": "string" },
    "swarmmanagerpublicIPDnsName": { "type": "string" },
    "DSCConfigurationURL": { "type": "string" },
    "DSCExtensionManagerTagVersion": { "type": "string" },
    "RegistrationKey": { "type": "securestring" },
    "RegistrationUrl": { "type": "string" },
    "LicenseType": { "type": "string" }
  },
  "outputs": {
    "returnedIPAddress": {
      "type": "string",
      "value": "[reference(parameters('swarmmanager1NicName')).ipConfigurations[0].properties.privateIPAddress]"
    }
  }
}
If you want to pass in PS credentials (the DSC extension expects RegistrationKey to be a PSCredential, which is exactly what the error message says), do this:
"protectedSettings": {
"configurationArguments": {
"RegistrationKey": {
"userName": "whatever",
"password": "[parameters('RegistrationKey')]"
}
}
}
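Combined with the question's template, the extension's properties could then look like this sketch: RegistrationKey moves out of settings into protectedSettings as a userName/password pair, and the old Items block goes away:

"settings": {
  "wmfVersion": "latest",
  "configurationArguments": {
    "RegistrationUrl": "[parameters('registrationUrl')]",
    "NodeConfigurationName": "SwarmManager.localhost",
    "RebootNodeIfNeeded": true
  }
},
"protectedSettings": {
  "configurationArguments": {
    "RegistrationKey": {
      "userName": "whatever",
      "password": "[parameters('RegistrationKey')]"
    }
  }
}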

ElasticSearch indexing issue, failed to parse timestamp

I am new to ELK.
I have created this index in Elasticsearch:
{
  "logstash": {
    "aliases": {},
    "mappings": {
      "log": {
        "dynamic_templates": [
          {
            "message_field": {
              "path_match": "message",
              "match_mapping_type": "string",
              "mapping": {
                "norms": false,
                "type": "text"
              }
            }
          },
          {
            "string_fields": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": {
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                },
                "norms": false,
                "type": "text"
              }
            }
          }
        ],
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "keyword",
            "include_in_all": false
          },
          "activity": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "beat": {
            "properties": {
              "hostname": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                }
              },
              "name": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                }
              },
              "version": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword"
                  }
                }
              }
            }
          },
          "filename": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "host": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "input_type": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "message": {
            "type": "text",
            "norms": false
          },
          "offset": {
            "type": "long"
          },
          "source": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "tags": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "timestamp": {
            "type": "date",
            "include_in_all": false,
            "format": "YYYY-MM-DD HH:mm:ss.SSS"
          },
          "type": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          },
          "user": {
            "type": "text",
            "norms": false,
            "fields": {
              "keyword": {
                "type": "keyword"
              }
            }
          }
        }
      }
    },
    "settings": {
      "index": {
        "creation_date": "1488805244467",
        "number_of_shards": "1",
        "number_of_replicas": "0",
        "uuid": "5ijhh193Tr6y_hxaQrW9kg",
        "version": {
          "created": "5020199"
        },
        "provided_name": "logstash"
      }
    }
  }
}
Below is my Logstash configuration (note the closing brace for the output block, which was missing in my first paste):
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\] ALL AUDIT: User \[%{GREEDYDATA:user}\] is %{GREEDYDATA:activity} \[%{GREEDYDATA:filename}\] for transfer." }
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    index => "logstash"
  }
}
Sample Data
[2017-03-05 12:37:21.465] ALL AUDIT: User [user1] is opening file [filename1] for transfer.
But when I load the file through Filebeat > Logstash > Elasticsearch,
I get the error below from Elasticsearch:
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [timestamp]
Caused by: java.lang.IllegalArgumentException: Invalid format: "2017-03-05T12:36:33.606" is malformed at "12:36:33.606"
at org.joda.time.format.DateTimeParserBucket.doParseMillis(DateTimeParserBucket.java:187) ~[joda-time-2.9.5.jar:2.9.5]
Please help: what timestamp format should I configure?
In your timestamp mapping you declare the format as "format": "YYYY-MM-DD HH:mm:ss.SSS", but the value you are sending through Beats does not match it; check: 2017-03-05T12:36:33.606.
That's why Elastic is complaining about the format. Your format should be "yyyy-MM-dd'T'HH:mm:ss.SSS" (notice the quoted capital T; note too that these Joda-style patterns are case-sensitive, so use yyyy-MM-dd rather than YYYY-MM-DD, which would mean week-year and day-of-year).
See the documentation for more details: https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-date-format.html
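Concretely, the timestamp mapping would then look like this (only the format string changes):

"timestamp": {
  "type": "date",
  "include_in_all": false,
  "format": "yyyy-MM-dd'T'HH:mm:ss.SSS"
}

If both the space-separated form from the raw log line and the T-separated form can arrive, Elasticsearch also accepts alternative formats joined with ||, e.g. "yyyy-MM-dd HH:mm:ss.SSS||yyyy-MM-dd'T'HH:mm:ss.SSS".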
