Kaa - Avro logicalType

I tried to upload an Avro schema file into Kaa. One of the fields in the log schema used a logicalType, but after I uploaded the file and created the log schema, I downloaded it and checked: the logicalType attribute had been removed.
Does Kaa support Avro's logicalType? If not, are there any plans to support it in the future?

Kaa doesn't support logicalType. There are no plans to implement it, but you can create a feature request on the Kaa JIRA.
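For reference, a logicalType is an extra attribute on a field's type in the Avro schema JSON. A minimal sketch of the kind of field that gets stripped (record and field names are made up for illustration):
{
  "type": "record",
  "name": "LogRecord",
  "fields": [
    {"name": "timestamp", "type": {"type": "long", "logicalType": "timestamp-millis"}}
  ]
}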

Related

Can we use standalone Spring Cloud Schema Registry with Confluent's KafkaAvroSerializer?

I have a project using Spring Cloud Stream with the Kafka Streams binder. For the output of a stream, I am using Avro, with the Serde provided by Confluent (io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde).
I am able to use it with the Confluent Schema Registry; serialization and deserialization take place correctly.
However, I wanted to see if we can use the Spring Cloud Schema Registry Server instead of the Confluent one. I configured a standalone Schema Registry server and set the schema registry in my project to it (changed the schemaRegistryClient.endpoint and schema.registry.url properties).
When I tried it out, Spring Cloud seems to work with the standalone server: it registers the schema available in the resources folder as a .avsc file. However, when I send a message, the Confluent serializer continues to treat the server as a Confluent Schema Registry (which has different REST endpoints from the Spring Schema Registry). As a result, it gets a 405 response code.
We get the following exception (partial stack trace):
org.apache.kafka.common.errors.SerializationException: Error registering Avro schema: <my-avro-schema>
Caused by: io.confluent.kafka.schemaregistry.client.rest.exceptions.RestClientException: Unexpected character ('<' (code 60)): expected a valid value (JSON String, Number, Array, Object or token 'null', 'true' or 'false')
at [Source: (sun.net.www.protocol.http.HttpURLConnection$HttpInputStream); line: 1, column: 2]; error code: 50005
at io.confluent.kafka.schemaregistry.client.rest.RestService.sendHttpRequest(RestService.java:230)
It seems to me that there are two possibilities:
Spring Schema Registry Server can work only with the content-type provided by Spring (specified as content-type: application/*+avro) and not with the native Serde provided by Confluent, or
There is an issue with the project configuration.
Can someone help me figure out which one it is? If it is the second one, can someone point out what is wrong?
Each schema registry provider requires its own SerDe library. For example, if you want to integrate the AWS Glue Schema Registry with Kafka, you need Amazon's SerDe library. Likewise, Confluent's SerDe library expects a Confluent Schema Registry at the address specified in the schema.registry.url property.
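As an illustration (not the poster's actual configuration), Confluent's Serde is wired to a Confluent-compatible registry roughly like this; MyRecord and the URL are placeholders:
import io.confluent.kafka.streams.serdes.avro.SpecificAvroSerde;
import java.util.Collections;
import java.util.Map;

// The Confluent Serde's client only speaks Confluent's Schema Registry REST API,
// so pointing schema.registry.url at a Spring Cloud Schema Registry leads to
// failures like the 405 / JSON parse error shown above.
Map<String, Object> serdeConfig =
        Collections.singletonMap("schema.registry.url", "http://localhost:8081");
SpecificAvroSerde<MyRecord> valueSerde = new SpecificAvroSerde<>();
valueSerde.configure(serdeConfig, false); // false = configure as a value Serde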

Retrieve access token from Request Connector in Mule Anypoint Studio

I want to store the access token value from the Request connector separately in a variable.
Can someone help with this?
I am using the OAuth module 1.1.
You can use the <oauth:retrieve-access-token> operation to get the token from the token manager.
Example:
<oauth:retrieve-access-token tokenManager="tokenManagerConfig" target="accessToken"/>
I recommend upgrading the module to the latest available version to get bug fixes; 1.1.0 is an old version by now.
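For example (a sketch assuming a token manager named tokenManagerConfig, as in the snippet above), the stored value can then be referenced as a flow variable:
<oauth:retrieve-access-token tokenManager="tokenManagerConfig" target="accessToken"/>
<logger level="INFO" message="#[vars.accessToken]"/>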

What Options Are There To Protect Database Passwords In Corda?

The Corda database password is stored in the node.conf in plain text by default:
https://docs.corda.net/head/node-database.html
What options are available to avoid this?
For example, can you use Jasypt, or store the value in an environment variable or a cloud vault system?
Are there samples/examples available?
It is possible to override node.conf settings using environment variables or JVM arguments; see here for more info: https://docs.corda.net/corda-configuration-file.html#overriding-values-from-node-conf
The Enterprise version of Corda also ships with a configuration obfuscator tool that can be used to obfuscate sensitive settings: https://docs.corda.r3.com/tools-config-obfuscator.html
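As a rough sketch of the override approach (assuming the standard dataSourceProperties.dataSource.password key; the linked page documents the exact prefix and naming rules), the password could be supplied at startup instead of being written into node.conf:
java -Dcorda.dataSourceProperties.dataSource.password=$DB_PASSWORD -jar corda.jar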

RavenDB patch API in the embedded version of the server

Is there any difference in the patch API between the embedded and standard versions of the server?
Is there a need to configure the document store in some way to enable the patch API?
I'm writing a test which uses embedded RavenDB. The code works correctly against the standard version, but in the test it doesn't: I constantly receive the patch result DocumentDoesNotExists. I've checked with the debugger and the document exists in the store, so it is not a problem with the test.
Here you can find a repro of my issue: https://gist.github.com/pblachut/c2e0e227fa3beb51f4f9403505c292bb
I reached out to RavenDB support and have an answer to my question.
There should be no difference between the embedded and standard versions of the server. The problem was that I did not explicitly pass which database the batch command should run against; as a result, I was trying to patch a document in the system database.
var result = await documentStore.AsyncDatabaseCommands.ForDatabase("testDb").BatchAsync(new[] {command});
I assumed that the database name would be taken from the session (because I get the documentStore from there), but the database name should always be passed explicitly.
var documentStore = session.Advanced.DocumentStore;

How to send API payload content to BAM

I'm using API Manager version 1.7.0 and BAM version 4.2.0. After installing the API_Manager_Analytics toolbox I have some predefined field values (e.g. payload_api, payload_apiPublisher, etc.). For requests, I can see them in the Cassandra DB under EVENT_KS in org_wso2_apimgt_statistics_request. How do I get the field values of the requests used to invoke the APIs in org_wso2_apimgt_statistics_request? How do I pass the SOAP body payload content to BAM? Thanks, Gius
You can write a custom data publisher as mentioned here [1], or you can use the BAM mediator [2] and publish to a separate stream.
1. https://nadeesha678.wordpress.com/2015/12/14/how-to-publish-custom-set-of-data-from-api-manager-to-wso2-business-activity-monitor/
2. https://docs.wso2.com/display/ESB481/BAM+Mediator
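As a rough illustration of option 2 (server profile and stream names are placeholders; the BAM mediator documentation linked above describes the exact attributes), the mediator is dropped into the mediation sequence roughly like this:
<bam>
    <serverProfile name="bam_server_profile">
        <streamConfig name="api_payload_stream" version="1.0.0"/>
    </serverProfile>
</bam>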
