Schema validation for null values with OpenAPI (swagger) JSON schema definitions - json.net

I am looking for a solution to the incompatibility between Swagger (OpenAPI) data types and JSON Schema when it comes to handling null values.
Our swagger file includes all our schema definitions, and I would like to use JSON.Net Schema for the schema validation step in our API tests.
A valid swagger property definition:
"description": {
  "type": "string",
  "nullable": true
}
will fail JSON schema validation for null values (Invalid type. Expected String but got Null).
If I replace the nullable property definition with:
"description": {
  "type": ["string", "null"]
}
validation will be successful for null values, but this breaks the swagger syntax.
Structural error at components.schemas.CalendarFunctionsDto.properties.description_EN.type
should be equal to one of the allowed values
allowedValues: array, boolean, integer, number, object, string
I couldn't find an OpenAPI schema to JSON schema converter for .NET. I'm trying to figure out if there is an easy solution available using JSON.Net Schema to solve this problem. Some of our types are more complex than the example above. I am looking for a solution that works for all "nullable" types.
I would ideally like to keep valid swagger (OpenAPI 3.0) JSON syntax for the input, programmatically perform some spells in C# for all nullable properties (convert the schema, or adjust the validation, or any other creative solution), and then validate the schema using JSON.Net Schema.

OpenAPI 3.0.0/3.0.1/3.0.2/3.0.3 does not support the null type; it only supports nullable.
JSON Schema does not support nullable; it supports the null type.
Can you update your spec to OpenAPI v3.1.0?
That version supports the null type. Then you can use one of these options:
Option 1 (unlikely to work in OpenAPI tooling because arrays of types are such a new addition):
"description": {
  "type": ["string", "null"]
}
Option 2 (more likely to work with OpenAPI tooling because type is not an array):
"description": {
  "oneOf": [
    {"type": "string"},
    {"type": "null"}
  ]
}
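If staying on OpenAPI 3.0 is a requirement, the conversion the question asks about can also be done as a pre-processing step: walk the schema and rewrite every nullable: true into a JSON Schema type array before handing the document to the validator. A rough sketch of that walk (shown in Python for brevity; the same recursion can be written in C# over Newtonsoft.Json JTokens):

```python
import json

def make_nullable_valid(schema):
    """Recursively rewrite OpenAPI 3.0 'nullable: true' into a JSON Schema
    type array, e.g. {"type": "string", "nullable": true} becomes
    {"type": ["string", "null"]}."""
    if isinstance(schema, dict):
        # Drop the 'nullable' keyword and fold it into 'type' if present.
        if schema.pop("nullable", False) and "type" in schema:
            t = schema["type"]
            schema["type"] = t + ["null"] if isinstance(t, list) else [t, "null"]
        for value in schema.values():
            make_nullable_valid(value)
    elif isinstance(schema, list):
        for item in schema:
            make_nullable_valid(item)
    return schema

spec = json.loads('{"description": {"type": "string", "nullable": true}}')
print(json.dumps(make_nullable_valid(spec)))
# {"description": {"type": ["string", "null"]}}
```

A production-grade converter would also need to handle nullable sitting next to $ref, enum, or oneOf/allOf, which is where a dedicated OpenAPI-to-JSON-Schema converter would earn its keep.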

Related

How do I create JSON type in entity in Symfony

I am actually looking for a way to validate elements in JSON. I thought there would be a way to list them out so as to strictly avoid accepting wrong elements. For instance, instead of "gender": "male" as illustrated below, someone could send "sex": "male", and I am trying to avoid that.
I have a data field (column) called Profile
profile = {'name': 'Payne', 'gender': 'male', 'favourites': [{'drinks': 'soda'}, {'colour': 'blue'}, {'game': 'scrabble'}], 'dob': '1962'}
I am using a third party API to populate the database using HttpClient.
My response is returning JSON and I want to make some decisions with it and store it in the database but I need to validate it in conformity with what is expected strictly.
About validation:
If you know how to do it with arrays, you can decode the JSON, validate it as an array, then encode it again. Symfony has a validator service, but I do not know exactly how to use it correctly in all cases.
The official Symfony documentation for the validator service can be found at these links:
https://symfony.com/doc/current/validation.html
https://symfony.com/doc/current/validation.html#using-the-validator-service
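The strict-keys check itself is simple once the JSON is decoded: compare the decoded keys against a whitelist and reject anything unexpected. A minimal sketch of the idea (in Python for illustration; in PHP the same check can be built from json_decode() and array_diff_key()):

```python
import json

# The expected profile fields from the question.
ALLOWED_KEYS = {"name", "gender", "favourites", "dob"}

def validate_profile(raw_json):
    """Return the decoded profile, or raise ValueError if it contains
    any key outside the expected set (e.g. 'sex' instead of 'gender')."""
    profile = json.loads(raw_json)
    unexpected = set(profile) - ALLOWED_KEYS
    if unexpected:
        raise ValueError(f"unexpected fields: {sorted(unexpected)}")
    return profile

validate_profile('{"name": "Payne", "gender": "male", "dob": "1962"}')  # accepted
# validate_profile('{"sex": "male"}') would raise ValueError
```

This only checks the top-level keys; nested structures such as favourites would need the same treatment one level down, which is exactly what a schema-based validator automates.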
Some info about JSON in PHP:
The most typical use of JSON, even for storing and retrieving content to and from a database, is via json_encode() and json_decode(), which basically help make your arrays more "portable" across different use cases.
For example, in MySQL, which accepts the JSON type, you can insert arrays into a column by encoding them with JSON.
If you want to declare a JSON type variable in Symfony, you can do it as in this example:
/**
 * @ORM\Column(type="json", ...)
 */
private $yourColumn;
Both json_encode() and json_decode() have been available since PHP 5.2.0.
As an example of a DB, the JSON type was added to MySQL in version 5.7.8 (https://dev.mysql.com/doc/refman/5.7/en/json.html)
You should take a look at these links:
https://www.php.net/manual/es/function.json-encode.php
https://www.php.net/manual/es/function.json-decode.php
https://www.w3schools.com/js/js_json_php.asp
https://dev.mysql.com/doc/refman/8.0/en/

Gremlin.NET deserialize number property

I have a query that creates a vertex and an edge "Created". The edge has a property "on", which is a Unix timestamp stored as a long. When I execute a query with the following segment in the Azure Cosmos DB terminal, it works as expected: an object is returned with an "On" property that is a number.
.project('Id', 'CreatedOn')
    .by('id')
    .by(
        select('createdEdge')
            .by('on')
    )
When I execute this query from my application code using Gremlin.NET, it fails with the error:
JSON type not supported.
I see in the source code that the deserialization logic of Gremlin.NET does not seem to handle any number types. Is this really the case? Is there no way to use long, float, int property types?
Gremlin.NET does not seem to handle any number types. Is this really the case? Is there no way to use long, float, int property types?
Gremlin.NET does of course support serialization of numerical types. TinkerPop however has its own serialization formats, one of which is the JSON based GraphSON format that is also supported by Cosmos DB.
GraphSON basically serializes everything as an object in JSON that consists of a type key and the value. So, an integer will be serialized like this:
{
  "@type" : "g:Int32",
  "@value" : 100
}
This type identifier was added with GraphSON 2, but it is only in GraphSON 3 that it is used for essentially all types. GraphSON 2 still serialized some types without the type identifier. The GraphSON 2 spec describes this as follows:
With GraphSON 2.0, there are essentially two type formats:
A non-typed value which is assumed the type implied by JSON. These non-types are limited to String, Boolean, Map and Collection.
All other values are typed by way of a "complex object" that defines a @typeId and @value.
As you can see, numerical types are not listed as types that don't have the type identifier.
The Cosmos DB docs mention that they only support GraphSON 2, but it looks like they serialize numerical types without this type identifier. This used to work with Gremlin.NET versions < 3.5.0 as that still had the logic to deserialize numerical types without a type identifier. This was simply a leftover from the GraphSON 1 format which didn't have these type identifiers.
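To make that concrete, the same long value can arrive in two shapes; the first is what the GraphSON 2 spec prescribes, the second is what Cosmos DB apparently sends (the value is illustrative):

```
// GraphSON 2 per the TinkerPop spec:
{ "@type" : "g:Int64", "@value" : 1630000000000 }

// What Cosmos DB returns for the same property:
1630000000000
```

The second shape is indistinguishable from a plain JSON number, which is exactly the case Gremlin.NET 3.5.0 stopped handling.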
Gremlin.NET 3.5.0 came with major changes to the JSON serialization, mostly due to the switch from Newtonsoft.JSON to System.Text.Json, and this logic was removed as part of that change.
So, it looks like you need to stay on Gremlin.NET 3.4 until Cosmos DB fixes their serialization.
Note there is a workaround, as discussed here: https://www.mail-archive.com/dev@tinkerpop.apache.org/msg22532.html
In short, you can create a custom reader derived from GraphSON2Reader:
public class CustomGraphSON2Reader : GraphSON2Reader
{
    public override dynamic ToObject(JsonElement graphSon) =>
        graphSon.ValueKind switch
        {
            // numbers
            JsonValueKind.Number when graphSon.TryGetInt32(out var intValue) => intValue,
            JsonValueKind.Number when graphSon.TryGetInt64(out var longValue) => longValue,
            JsonValueKind.Number when graphSon.TryGetDecimal(out var decimalValue) => decimalValue,
            _ => base.ToObject(graphSon)
        };
}
and then pass it into your client:
var client = new GremlinClient(server, new GraphSON2MessageSerializer(new CustomGraphSON2Reader()));

Sparse fields on complex JSON API attributes

According to #document-resource-object-attributes, attributes are allowed to have "complex" values, i.e. any valid JSON value.
With #fetching-sparse-fieldsets it is possible to select a subset of the fields. However, all the examples match on the attribute name.
For example:
{
  "data": [
    {
      "type": "dogs",
      "id": "3f02e",
      "attributes": {
        "name": "doggy",
        "body": {
          "head": "small",
          "legs": [
            { "position": "front", "side": "right" },
            { "position": "front", "side": "left" }
          ],
          "fur": { "color": "brown" }
        }
      }
    }
  ]
}
In the result I am only interested in the name, body.head and body.fur.color.
What would be a correct way to solve this (preferably without requiring relations, since this data is valid)?
JSON:API's Sparse Fieldsets feature allows a client to request only specific fields of a resource:
A client MAY request that an endpoint return only specific fields in the response on a per-type basis by including a fields[TYPE] parameter.
https://jsonapi.org/format/#fetching-sparse-fieldsets
A field is either an attribute or a relationship in JSON:API:
A resource object’s attributes and its relationships are collectively called its “fields”.
https://jsonapi.org/format/#document-resource-object-fields
Sparse Fieldsets are not meant to have an impact on the value of an attribute or a relationship. If you have such a need, you shouldn't model the data as a complex value but expose it as a separate resource.
Please note that your database schema and the resources exposed by your API do not need to be the same. It often makes sense not to have a 1-to-1 relationship between database tables and the resources in your JSON:API.
Don't be afraid of having multiple resources. It's often much better for the long-term than having one resource with complex objects:
You can include the related resources (e.g. dog-bodies, dog-legs, dog-furs in your case) by default.
You can generate the IDs for those resources automatically, based on the persisted ID of a parent resource.
You can have much stricter constraints and easier documentation for your API with separate resources.
You can reduce the risk of collisions, as you can support updating specific parts (e.g. the color attribute of a dog-furs resource) rather than replacing the full body value of a dogs resource.
The main drawback that I currently see with having multiple resources instead of one is the limitation that you can't create or update more than one resource in the same request with JSON:API v1.0. But it's very likely that the upcoming v1.1 won't have that limitation anymore. An official extension called Atomic Operations has been proposed for that use case by a member of the core team working on the spec.
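To make the resource split concrete: with a hypothetical fur relationship and dog-furs resource type (all names below follow the dog example and are made up), a request for just the name and the fur colour could look like this:

```
GET /dogs/3f02e?include=fur&fields[dogs]=name,fur&fields[dog-furs]=color

{
  "data": {
    "type": "dogs",
    "id": "3f02e",
    "attributes": { "name": "doggy" },
    "relationships": {
      "fur": { "data": { "type": "dog-furs", "id": "3f02e-fur" } }
    }
  },
  "included": [
    { "type": "dog-furs", "id": "3f02e-fur", "attributes": { "color": "brown" } }
  ]
}
```

Each nested object the client cares about becomes an addressable resource, so sparse fieldsets apply to it naturally.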

Cannot create a new entity (created with ECK) through API using REST module

Here is my situation: I am using the ECK module with Drupal 8 to create entities and bundles, and the new REST core module to create API features.
I have installed the REST_UI module, and I enabled the route for the entity I am interested in.
Here's my issue: I created an entity type and a bundle with ECK, and I can create a new entity by calling the /entity/entity_type_name endpoint with a POST request, passing the following JSON body:
{
  "type": [{"target_id": "bundle_name"}],
  "field_test_text": [{"value": "test"}]
}
However, this only works when I have a single entity type in my list of entities. If, for example, I create a second entity type and then run the same request, I get the following error message:
Drupal\Core\Entity\Exception\AmbiguousEntityClassException: Multiple entity types found for Drupal\eck\Entity\EckEntity
I understand that, now that I have multiple entity types, the Entity API can no longer work out which type of entity it has to create (which I find pretty weird, considering that I provide it in the URL in the form /entity/entity_type_name, and that there are different routes available for my different entity types).
I guess I need to pass an extra parameter in my json for Drupal to understand what kind of entity it should create, but what would be this parameter ? I've been trying to look online and in the documentation, but I cannot figure out how to do this.
I had the same problem, and here is how I resolved it:
Enable the HAL module.
Enable hal_json under Accepted request formats in /admin/config/services/rest for that particular resource.
Then, in your POST request, use headers:
Content-Type: application/hal+json
X-CSRF-Token: [AUTH SESSION TOKEN]
And the body of the request being:
{
  "_links": {
    "type": {
      "href": "http://localhost:8080/rest/type/[ENTITY_TYPE]/[ENTITY_BUNDLE]"
    }
  },
  "title": [
    {"value": "This is a new entity title"}
  ],
  "field_example": [
    {"value": "This is an example of a custom text field value."}
  ]
}
Drupal is reading the entity type and bundle from the _links.type.href string.
For example, if your entity type was automobile and your bundle was car, your URL would be "http://localhost:8080/rest/type/automobile/car"

How to send Json to Azure Appinsights with c# library

I'm implementing Azure's Application Insights, and the API I found only lets me send a Dictionary of type string and string. Even TraceTelemetry only has a Properties member, which is again a dictionary of string and string.
However, when I add one field to the custom properties ("cars" in my case) whose value is serialized JSON, a payload like the following is sent to Application Insights:
"baseData": {
  "ver": 2,
  "message": "Test Message",
  "properties": {
    "cars": "[{\"Id\":0,\"Price\":{\"Value\":12.32,\"Currency\":.....
  }
}
Notice the backslashes making it a single JSON string value.
But the Application Insights portal will understand it and parse it.
So I can use the Microsoft-provided C# API, but it just looks ugly, and the wire format is JSON anyway, so why is the API limited to Dictionary<string, string>?
It is because of filtering in the Azure Portal. The main purpose of Properties (Dictionary<string, string>) is to provide the ability to find specific requests, exceptions, etc. You are also limited in the number of properties (it was about 200). The typical properties are "username", "isAuthenticated", "role", "score", "isAnonymous", "portalName", "group", "product", etc.: typically global properties.
If you want to send a whole object as JSON, you can use TrackTrace(). You can then find all the traces related to a concrete request in the portal.
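The backslashes mentioned in the question are just ordinary JSON string escaping: because property values must be strings, the serialized object becomes one escaped string inside the outer envelope. A small illustration of that double serialization (in Python for brevity; the C# side behaves the same way when you JSON-serialize an object into a property value):

```python
import json

cars = [{"Id": 0, "Price": {"Value": 12.32}}]

# Property values must be strings, so serialize the object first...
properties = {"cars": json.dumps(cars)}

# ...and when the telemetry envelope itself is serialized, the inner
# quotes get escaped, producing the backslashes seen in the portal.
envelope = json.dumps({"message": "Test Message", "properties": properties})
print(envelope)
```

The portal simply reverses the inner json.dumps step when it parses the property value.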
