I'm currently working on a new Kaa IoT application and am trying to import an enum into the Common Type Library, but I keep getting the following error:
Schema validation error: Schema com.company.project.SimpleEnumObject is not a record schema!
My Avro schema:
{
    "namespace": "com.company.project",
    "type": "enum",
    "name": "SimpleEnumObject",
    "symbols": [
        "ENUM_VALUE_1",
        "ENUM_VALUE_2",
        "ENUM_VALUE_3"
    ]
}
As the error message says, your Common Type Library (CTL) schema must be a record schema. See the CTL documentation for details.
Look at the Adding log schema section for an example of a valid schema with an enum field. You can also use the Avro UI sandbox console to construct a schema and view its JSON representation.
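For illustration, a record schema that wraps your enum in a field, along these lines, should pass validation (the record and field names here are just placeholders):

{
    "namespace": "com.company.project",
    "type": "record",
    "name": "SimpleRecordObject",
    "fields": [
        {
            "name": "enumField",
            "type": {
                "type": "enum",
                "name": "SimpleEnumObject",
                "symbols": ["ENUM_VALUE_1", "ENUM_VALUE_2", "ENUM_VALUE_3"]
            }
        }
    ]
}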
I have a project using Spring Cloud Stream with the Kafka Streams binder. For the output of a stream I am using Avro, with the Serde provided by Confluent (io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde).
I am able to use it with the Confluent Schema Registry; serialization and deserialization take place correctly.
However, I wanted to see whether we could use the Spring Cloud Schema Registry Server instead of the Confluent one. I configured a standalone Schema Registry server and pointed my project at it (I changed the schemaRegistryClient.endpoint and schema.registry.url properties).
When I tried it out, Spring Cloud seems able to work with the standalone server: it registers the schema that is available in the resources folder as a .avsc file. However, when I send a message, the Confluent serializer still treats the server as a Confluent Schema Registry (which has different REST endpoints from the Spring Schema Registry). As a result, it gets a 405 response code.
We get the following exception (partial stack trace):
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: <my-avro-schema>
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:230)
It seems to me that there are two possibilities:
The Spring Schema Registry Server can work only with the content type provided by Spring (specified as content-type: application/*+avro) and not with the native Serde provided by Confluent, or
There is an issue with the project configuration.
Can someone help me figure out which one it is? If it is the second, can someone point out what is wrong?
Each schema registry provider requires its own proprietary SerDe library. For example, if you wanted to integrate the AWS Glue Schema Registry with Kafka, you would need Amazon's SerDe library. Likewise, Confluent's SerDe library expects a Confluent Schema Registry at the address specified in the schema.registry.url property: it registers and looks up schemas through Confluent-specific REST endpoints that the Spring Cloud Schema Registry Server does not expose, which is why you get the 405 response.
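For illustration, here is a minimal sketch of how the Confluent Serde is bound to its registry (MyAvroRecord stands in for a generated SpecificRecord class, and the URL is a placeholder). On first use the underlying serializer registers the schema by calling POST /subjects/<subject>/versions on whatever server is configured here:

import java.util.Collections;
import java.util.Map;
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;

// The Serde only knows how to talk to a Confluent-style registry at this URL.
Map<String, String> serdeConfig =
        Collections.singletonMap("schema.registry.url", "http://localhost:8081");
SpecificAvroSerde<MyAvroRecord> valueSerde = new SpecificAvroSerde<>();
valueSerde.configure(serdeConfig, false); // false = configure as a value serde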
We are using spring-kafka 2.2.7.RELEASE to produce and consume Avro messages, with a schema registry for schema validation and 'FORWARD_TRANSITIVE' as the compatibility type. I'm trying to use 'ErrorHandlingDeserializer2' from spring-kafka to handle the error when a deserializer fails to deserialize a message, and I'm now writing a component test for this configuration. The test is expected to have the steps below:
Spin up a local Kafka cluster using Docker containers
Send an Avro message with an invalid schema (using KafkaTemplate) to a test topic, to re-create/simulate the deserialization exception
What's happening is that, since the schema registry is in place, if I send a message with a new (invalid) schema, the schema is validated against the compatibility type setting and I cannot produce the message onto Kafka: an exception is thrown at the producer level itself.
My question is: in this scenario, how can I create/simulate a deserialization exception to test my configuration?
Note: I don't want to disable/stop the schema registry, because that wouldn't reflect our production setup.
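One way to provoke the failure without touching the registry, assuming (as is usually the case) that the compatibility check happens when the serializer tries to register the schema rather than on the broker, is a test producer that publishes raw bytes with a ByteArraySerializer. No schema is registered or validated, and the payload is not valid Avro wire format, so the consumer's deserializer fails. A minimal sketch, with topic name, bootstrap address, and payload as placeholders:

import java.util.HashMap;
import java.util.Map;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.ByteArraySerializer;
import org.apache.kafka.common.serialization.StringSerializer;

// Bypass the Avro serializer entirely: the registry is never consulted,
// and the consumer's Avro deserializer will fail on the malformed payload.
Map<String, Object> props = new HashMap<>();
props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
try (KafkaProducer<String, byte[]> producer = new KafkaProducer<>(props)) {
    producer.send(new ProducerRecord<>("test-topic", "key", "not-avro".getBytes()));
}

On the consumer side, ErrorHandlingDeserializer2 wraps the real Avro deserializer through its spring.deserializer.value.delegate.class property, so the failure surfaces as a handled error instead of a tight poll loop.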
I am going to parse and format flat file input based on business logic stored in SQL Server database tables. I don't have a document schema for the input. I wrote a C# custom component class for the disassembly. When I use the custom component in the Disassemble stage of a receive pipeline, I get a 'document schema not found' error.
Did anyone come across the same situation and handle it differently?
BizTalk routes messages using the 'MessageType' property (the namespace plus the root node name of the XML in the message) in the context portion of the message. You don't have that with your design, so BizTalk doesn't know what to do with the message.
You can:
handle each type of flat file separately by parsing and assigning a unique message type
distill the content into one type of message
wrap the content of the file in an 'envelope'
You'll need to create a schema for any of those choices.
Namespaces and routing are a spiffy way to handle changes to file structure. If you include the version of the file format in the namespace (e.g. http://company.com/schemas/orders/v2, a made-up example), BizTalk can route the message to the code that handles that kind of message for you. You can continue to handle old-style messages as well as new formats; we handle pilot programs that way.
While importing an Oracle schema from a dump file, I am getting the error below while creating tables.
ORA-14102: only one LOGGING or NOLOGGING clause may be specified.
I see the above error for several of the tables created from the dump file.
How can I enable or disable LOGGING/NOLOGGING at the schema level before I start the import?
When performing an Oracle database export with expdp from Oracle 11gR2 (11.2.0.1) and then importing it into the database with impdp, the following error messages appear in the import log file:
ORA-39083: Object type INDEX failed to create with error:
ORA-14102: only one LOGGING or NOLOGGING clause may be specified
This is a known Oracle 11gR2 issue: DBMS_METADATA.GET_DDL returns invalid syntax for a created index, so both the NOLOGGING and LOGGING keywords appear in the index-creation DDL. Download and apply Patch 8795792 from Oracle to resolve this issue.
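If applying the patch right away is not an option, a commonly suggested workaround (sketched below with placeholder connection, schema, directory, and file names) is to strip segment attributes, which include the LOGGING/NOLOGGING clause, during the import. Note that this also drops tablespace and storage clauses, so objects are created with the importing user's defaults:

impdp system@orcl SCHEMAS=myschema DIRECTORY=dump_dir DUMPFILE=export.dmp TRANSFORM=SEGMENT_ATTRIBUTES:n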
I use OTAClient.dll from HP to connect to Quality Center.
My connection code works:
TDConnection tdCon = new TDConnection();
tdCon.InitConnectionEx("http://...."); // QC server URL
tdCon.Login("username", "userpass");   // user credentials
tdCon.Connect("****", "********");     // domain and project
But on this line I get an error:
Req newReq = new Req();
I set the platform target to x86, but that didn't help.
Instead of the COM object, use the REST API. You can send XML documents with parameters for creating and updating requirements (Req) in QC; you can find the documentation in QC.
Use the POST method (via HttpWebRequest) for creating and updating, and the GET method for getting session parameters.
In the REST documentation, look at the description of creating entities. To maintain the connection you must save the QCSessionId; see the authentication description.
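For illustration, the flow looks roughly like this; the sketch is in Java with HttpURLConnection, and the same two calls map directly onto HttpWebRequest in C#. The server, domain, project, and field names are placeholders following the documented ALM REST conventions:

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

// Authenticate once, then reuse the returned session cookie on every call.
// (Calls throw IOException; run inside a method that declares it.)
URL authUrl = new URL("http://qcserver:8080/qcbin/authentication-point/authenticate");
HttpURLConnection auth = (HttpURLConnection) authUrl.openConnection();
String credentials = Base64.getEncoder()
        .encodeToString("username:userpass".getBytes(StandardCharsets.UTF_8));
auth.setRequestProperty("Authorization", "Basic " + credentials);
auth.getResponseCode();                                   // 200 -> cookie issued
String sessionCookie = auth.getHeaderField("Set-Cookie");

// POST an XML entity document to create a requirement.
URL reqUrl = new URL("http://qcserver:8080/qcbin/rest/domains/DEFAULT/projects/MyProject/requirements");
HttpURLConnection post = (HttpURLConnection) reqUrl.openConnection();
post.setRequestMethod("POST");
post.setDoOutput(true);
post.setRequestProperty("Cookie", sessionCookie);
post.setRequestProperty("Content-Type", "application/xml");
String entity = "<Entity Type=\"requirement\"><Fields>"
        + "<Field Name=\"name\"><Value>New requirement</Value></Field>"
        + "</Fields></Entity>";
try (OutputStream out = post.getOutputStream()) {
    out.write(entity.getBytes(StandardCharsets.UTF_8));
}
System.out.println("Create returned HTTP " + post.getResponseCode());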
Good luck!