JSchemaGenerator omits "$schema" in output - json.net

I am using Newtonsoft's JSchemaGenerator to generate a schema from .NET objects (see the docs).
Is there a way to make the generated schema include "$schema": "http://json-schema.org/draft-04/schema#"?
The JSON Schema specification defines it as:
The "$schema" keyword is both used as a JSON Schema version identifier and the location of a resource which is itself a JSON Schema, which describes any schema written for this particular version.

Related

Can we use standalone Spring Cloud Schema Registry with Confluent's KafkaAvroSerializer?

I have a project using Spring Cloud Stream with the Kafka Streams binder. For the output of a stream, I am using Avro, with the Serde provided by Confluent (io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde).
I am able to use it with the Confluent Schema Registry. Serialization and deserialization take place correctly.
However, I wanted to see if we can use the Spring Cloud Schema Registry Server instead of the Confluent one. I configured a standalone Schema Registry server and set the schema registry in my project to it (changed the schemaRegistryClient.endpoint and schema.registry.url properties).
When I tried it out, it seems Spring Cloud is able to work with the standalone server. It registers the schema available in the resources folder as a .avsc file. However, when I send a message, the Confluent serializer continues to treat the server as a Confluent Schema Registry (which has different REST endpoints from the Spring Schema Registry). As a result, it gets a 405 response code.
We get the following exception (partial stack trace):
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: <my-avro-schema>
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:230)
It seems to me that there are two possibilities:
Spring Schema Registry Server can work only with the content-type provided by Spring (specified as content-type: application/*+avro) and not with the native Serde provided by Confluent, or
There is an issue with the project configuration.
Can someone help me figure out which one it is? If it is the second one, can someone point out what is wrong?
Each schema registry provider requires its own proprietary SerDe library. For example, if you would like to integrate the AWS Glue Schema Registry with Kafka, you would need Amazon's SerDe library. Likewise, Confluent's SerDe library expects Confluent's Schema Registry at the address specified in the schema.registry.url property.
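As a sketch (the URL is an assumption, and the property path assumes the Kafka Streams binder's pass-through configuration map), the Confluent Serde is normally pointed at a Confluent-compatible registry via application.yml like this:

spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            configuration:
              # Passed through to Kafka Streams and picked up by the Confluent SerDes;
              # must point at a Confluent-compatible Schema Registry endpoint.
              schema.registry.url: http://localhost:8081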

BizTalk pipeline custom component disassemble with no document schema

I need to parse and format flat file input based on business logic stored in SQL Server database tables. I don't have a document schema for the input. I wrote a custom C# component class for disassembly. When I use the custom component in the Disassemble stage of a receive pipeline, I get a 'document schema not found' error.
Has anyone come across the same situation and handled it differently?
BizTalk routes messages using the 'MessageType' property (the namespace plus the root node name of the XML in the message) in the context portion of the message. You don't have that with your design, so BizTalk doesn't know what to do with the message.
You can:
handle each type of flat file separately by parsing it and assigning a unique message type,
distill the content into one type of message, or
wrap the content of the file in an 'envelope'.
You'll need to create a schema for any of those choices.
Namespaces and routing are a spiffy way to handle changes to file structure. If you include the version of the file in the namespace, BizTalk can route the message to the code that handles that kind of message for you. You can continue to handle old-style messages as well as new formats. We handle pilot programs that way.
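For the first option, a rough sketch of how a custom disassembler might promote the message type it has determined itself (pipeline-component plumbing such as IBaseComponent and IPersistPropertyBag is omitted; the namespace and root name below are placeholders):

using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public partial class FlatFileDisassembler : IDisassemblerComponent
{
    private IBaseMessage _message;

    public void Disassemble(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Parse the flat file here and decide which message type it represents.
        _message = pInMsg;
    }

    public IBaseMessage GetNext(IPipelineContext pContext)
    {
        if (_message == null)
            return null;

        // Promote BTS.MessageType (namespace#root) so BizTalk can route the message.
        _message.Context.Promote(
            "MessageType",
            "http://schemas.microsoft.com/BizTalk/2003/system-properties",
            "http://example.com/flatfile/v1#Invoice");

        IBaseMessage result = _message;
        _message = null;
        return result;
    }
}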

How to retrieve the source code of an XQuery module stored in eXist-db?

I am looking for a way to retrieve the source code of an XQuery module stored in the database.
Is there any way to do this using eXist-db's REST API, an XQuery extension function, or any other eXist-db interface?
If you are using the REST Server, you have two main options:
Do a GET on the XQuery stored in the db, with the query string parameter _source=yes. You need to change some settings in $EXIST_HOME/descriptor.xml to enable that.
Write a query for retrieving queries. A query stored in the database is like any other binary document, so you could use util:binary-doc() to get it.
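For the first option, a sketch of what this might look like (host and paths are placeholders): in $EXIST_HOME/descriptor.xml, allow the module's source to be served, then request it with _source=yes.

<!-- descriptor.xml: allow the source of this stored module to be returned -->
<allow-source>
    <xquery path="/db/apps/myapp/modules/my-module.xqm"/>
</allow-source>

GET http://localhost:8080/exist/rest/db/apps/myapp/modules/my-module.xqm?_source=yes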

Unable to import enum into CTL

I'm currently working on a new Kaa IoT application and am trying to import an enum into the Common Type Library, but I keep getting the following error:
Schema validation error: Schema com.company.project.SimpleEnumObject is not a record schema!
My avro description:
{
    "namespace": "com.company.project",
    "type": "enum",
    "name": "SimpleEnumObject",
    "symbols": [
        "ENUM_VALUE_1",
        "ENUM_VALUE_2",
        "ENUM_VALUE_3"
    ]
}
As described in the error message, your Common Type Library (CTL) schema must be a record. Read the CTL documentation for details.
Look at the Adding log schema section for an example of a valid schema with an enum field. You can also use the Avro UI sandbox console to construct a schema and view its JSON representation.
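A minimal sketch of a record schema wrapping your enum (the record and field names here are made up):

{
    "namespace": "com.company.project",
    "type": "record",
    "name": "SimpleRecordObject",
    "fields": [
        {
            "name": "enumField",
            "type": {
                "type": "enum",
                "name": "SimpleEnumObject",
                "symbols": ["ENUM_VALUE_1", "ENUM_VALUE_2", "ENUM_VALUE_3"]
            }
        }
    ]
}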

Using the WCF-SQL adapter

I need to poll data in XML format and map it to an EDI 834.
I have written the stored procedure using FOR XML AUTO, ELEMENTS.
When I consume it using Add Adapter Metadata, I get an XML message,
but I need to map this XML message to the EDI 834. How do I get the structure of the XML so that I can use it in the map?
I also followed this thread, http://social.msdn.microsoft.com/Forums/en-US/biztalkgeneral/thread/6a7e0093-0692-4ba5-9e14-0d2090c2cf54, generated the schemas using XML polling, and mapped them to the EDI 834.
But when I use the map as an outbound map, it doesn't map the polled data to the EDI 834.
The WCF-SQL adapter removes the need to use the 'for xml auto, elements' syntax; that is a legacy leftover from the old SQL adapter.
Just write your stored procedure in a manner to return a consistent result set, then generate metadata against the stored procedure. The adapter framework will create an appropriate schema based on the metadata returned from your stored procedure.
Then simply map the data from your WCF-SQL schema to your EDI834 schema.
- Create the stored procedure that returns XML (or an XML part) using the FOR XML PATH syntax.
- Set up a receive location using WCF-SQL. Select XmlPolling. Choose a root name and namespace for the adapter to wrap around the XML returned from SQL (mandatory).
- Set PollingStatement to: exec [SPNAME]
- Set PolledDataAvailableStatement to something appropriate that will return a count > 0 if there are rows/XML to be polled (see the sketch of these binding properties after this list).
- Use the PassThruReceive pipeline for the receive location.
- Set up a send port (FILE) that subscribes to everything that comes from the receive port used for the receive location.
- Start the application. Examine the XML returned from the adapter.
- In Visual Studio, generate a schema using well-formed XML (Add -> Add Generated Items -> Generate Schemas). (Note: you may have to run InstallWFX.vbs, found under the BizTalk SDK/Utilities/Schema Generator, if you have not already done this on the machine.)
- Choose the XML file generated by the adapter (give the file a name representing the schema you are trying to create).
- Now you should have a schema representing the XML returned by the adapter; you may have to go through the schema manually and change data types to something more appropriate than what the wizard has chosen.
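As a rough sketch of the WCF-SQL binding properties these steps refer to (the polling statements, root node name, and namespace are placeholders, assuming a standard XmlPolling setup):

InboundOperationType                = XmlPolling
PollingStatement                    = exec [SPNAME]
PolledDataAvailableStatement        = SELECT COUNT(*) FROM dbo.SomeTable WHERE Processed = 0
XmlStoredProcedureRootNodeName      = PollingRoot
XmlStoredProcedureRootNodeNamespace = http://example.com/polling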
