spring-kafka consumer throws deserialization exception when producer produces messages

I am new to spring-kafka. Basically I have a Spring Boot app with Spring Kafka (spring-kafka 2.9.0 and spring-boot 2.6.4). As long as I run the program as only a producer or only a consumer, I don't run into any issues.
But if I run the same program to produce messages to topic-A and listen for messages from topic-B at the same time, I run into deserialization errors in the producer (which is confusing) while sending messages to topic-A. The producer and consumer have their own configs; the producer serializes one POJO and the consumer deserializes a different one. I am failing to understand why the consumer is invoked while the producer is producing messages.
Can someone please help me understand what I am doing wrong?

My apologies; on further investigation I found that the issue was not with spring-kafka. It's a symptom of another issue. I am using a connector to read and write messages to a database, and for some reason, when the producer publishes a message, the sink connector publishes messages to topic-B. Since the Kafka consumer is listening to topic-B and is not configured to deserialize these newly published messages, it runs into exceptions. This has nothing to do with Spring Kafka.
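For anyone hitting the same symptom: one common way to keep a listener alive when unexpected payloads land on its topic is to wrap the real deserializer in spring-kafka's ErrorHandlingDeserializer, so a bad record produces a handled error instead of an exception in the poll loop. A minimal sketch, assuming JSON payloads; the topic's POJO type (com.example.TopicBEvent) is a hypothetical stand-in:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.kafka.support.serializer.ErrorHandlingDeserializer;
import org.springframework.kafka.support.serializer.JsonDeserializer;

import java.util.HashMap;
import java.util.Map;

public class ConsumerProps {

    public static Map<String, Object> consumerProps() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "my-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        // Wrap the value deserializer so a record that cannot be deserialized
        // is reported to the error handler instead of failing the poll loop.
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer.class);
        props.put(ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS, JsonDeserializer.class);
        // Hypothetical payload type for illustration.
        props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.example.TopicBEvent");
        props.put(JsonDeserializer.TRUSTED_PACKAGES, "com.example");
        return props;
    }
}
```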

Related

Spring Cloud Kafka - is there a way to know how many channels there are after startup

I'm exploring the Spring Cloud Stream binder for Kafka and playing with it. Is there a way to know the list of channels after startup, maybe just by printing the values in a logger?
I'm specifically looking for error channels and recordMetaData; I know that there is a default channel for errors.
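One way to approach this (a sketch, not a binder-specific API): Spring Cloud Stream registers each bound channel, including the default errorChannel, as a MessageChannel bean, so you can enumerate them from the ApplicationContext once the app is up:

```java
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.ApplicationContext;
import org.springframework.context.event.EventListener;
import org.springframework.messaging.MessageChannel;
import org.springframework.stereotype.Component;

@Component
public class ChannelLogger {

    private final ApplicationContext context;

    public ChannelLogger(ApplicationContext context) {
        this.context = context;
    }

    // After startup, print every bean of type MessageChannel;
    // error channels such as "errorChannel" show up here too.
    @EventListener(ApplicationReadyEvent.class)
    public void logChannels() {
        context.getBeansOfType(MessageChannel.class)
               .keySet()
               .forEach(name -> System.out.println("channel: " + name));
    }
}
```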

How to create a command by consuming a message from a Kafka topic rather than through a REST API

I'm using Axon version 3.3, which seamlessly supports Kafka, with this annotation on the Spring Boot main class:
@SpringBootApplication(exclude = KafkaAutoConfiguration.class)
In our use case, the command-side microservice needs to pick up messages from a Kafka topic rather than exposing a REST API. It will store the event in the event store and then move it to another Kafka topic for the query-side microservice to consume.
Since KafkaAutoConfiguration is disabled, I cannot use spring-kafka configuration to write a consumer. How can I consume a normal message in Axon?
I tried writing a normal Spring Kafka consumer, but since KafkaAutoConfiguration is disabled, the initial trigger for the command is not picked up from the Kafka topic.
I think I can help you out with this.
The Axon Kafka Extension is solely meant for Events.
Thus, it is not intended to dispatch Commands or Queries from one node to another.
This is very intentional, as Event messages have different routing needs compared to Command and Query messages.
Axon views Kafka as a fine fit for an Event Bus, and as such this is supported through the framework.
It is however not ideal for Command messages (which should always be routed to a single handler) or Query messages (which can be routed to a single handler, several handlers, or follow a subscription model).
Thus, if you want to "abuse" Kafka for different types of messages in conjunction with Axon, you will have to write your own component/service for it; a sketch of that follows below.
I would however stick to the messaging paradigm and separate these concerns.
To greatly simplify routing messages between Axon applications, I'd highly recommend trying out Axon Server.
Additionally, here you can hear/see Allard Buijze point out the different routing needs per message type (thus the reason why Axon's Kafka Extension only deals with Event messages).
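To illustrate the "write your own component" route: excluding KafkaAutoConfiguration only removes the auto-configured beans, so you can still declare a consumer factory and listener container by hand and dispatch each record into Axon's CommandGateway. A sketch under those assumptions; the topic name and CreateOrderCommand are made up for illustration:

```java
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.axonframework.commandhandling.gateway.CommandGateway;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.stereotype.Component;

import java.util.HashMap;
import java.util.Map;

@EnableKafka // needed because KafkaAutoConfiguration is excluded
@Configuration
class ManualKafkaConfig {

    // Declared by hand; auto-configuration would normally provide these.
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        Map<String, Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "command-ingest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return new DefaultKafkaConsumerFactory<>(props);
    }

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        return factory;
    }
}

// Hypothetical command type, standing in for your real command.
class CreateOrderCommand {
    final String payload;
    CreateOrderCommand(String payload) { this.payload = payload; }
}

@Component
class CommandIngestingListener {

    private final CommandGateway commandGateway;

    CommandIngestingListener(CommandGateway commandGateway) {
        this.commandGateway = commandGateway;
    }

    // Translate each Kafka record into an Axon command.
    @KafkaListener(topics = "incoming-commands")
    public void onMessage(String payload) {
        commandGateway.send(new CreateOrderCommand(payload));
    }
}
```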

Get Failed Messages with KafkaListener

I am using the latest spring-kafka version with @KafkaListener and a BatchListener. In the method that listens to the list of messages, I want to call acknowledge only if the batch of records has been processed. But the Spring framework does not redeliver those messages until I restart the application. So I used the stop() and start() methods on KafkaListenerEndpointRegistry when the records were not processed, but I feel that is not a good way of solving the problem. Is there a better way of handling this?
See the documentation for the SeekToCurrentBatchErrorHandler.
The SeekToCurrentBatchErrorHandler seeks each partition to the first record in each partition in the batch so the whole batch is replayed. This error handler does not support recovery because the framework cannot know which message in the batch is failing.
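For reference, wiring that error handler in is a one-liner on the container factory. A sketch, assuming a batch listener factory like the one below (note that SeekToCurrentBatchErrorHandler was deprecated in spring-kafka 2.8 in favor of DefaultErrorHandler):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.SeekToCurrentBatchErrorHandler;

@Configuration
class BatchListenerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> batchFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        factory.setBatchListener(true);
        // Throwing from the @KafkaListener method now seeks every partition
        // back to the first record of the failed batch, so the whole batch is
        // redelivered on the next poll -- no container stop/start needed.
        factory.setBatchErrorHandler(new SeekToCurrentBatchErrorHandler());
        return factory;
    }
}
```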

spring kafka error handling and retries

We are using Spring Kafka 1.3.3 and our app is a consume-process-publish pipeline.
How can we handle retries and seek-backs if there is a failure in the publish phase of the pipeline? E.g. the app consumes messages, processes them, and publishes them to another topic asynchronously. If there is an error in publishing:
How can I retry publishing the failed message?
If sending the message fails even after retries, how can I seek my consumer back to the previous offset? By then my consumer position will be somewhere ahead in the log.
How can I acknowledge the message in the producer callback when the message is successfully produced?
It's easier with newer releases because you have direct access to the Consumer, but with 1.3.x you can implement ConsumerSeekAware - see the documentation. You have to perform the seek on the listener thread because the Consumer is not thread-safe.
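A sketch of that pattern on the 1.3.x API (topic names are placeholders): store the seek callback per thread in registerSeekCallback, publish synchronously so a send failure surfaces on the listener thread, and seek back to the failed record's offset. The send future also answers the acknowledgment question, since a successful get() means the broker accepted the record.

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.TopicPartition;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.ConsumerSeekAware;

import java.util.Map;

public class SeekBackListener implements ConsumerSeekAware {

    // The callback must be used on the listener thread (the Consumer is not
    // thread-safe), so it is stored per thread.
    private static final ThreadLocal<ConsumerSeekCallback> seekCallback = new ThreadLocal<>();

    private final KafkaTemplate<String, String> template;

    public SeekBackListener(KafkaTemplate<String, String> template) {
        this.template = template;
    }

    @Override
    public void registerSeekCallback(ConsumerSeekCallback callback) {
        seekCallback.set(callback);
    }

    @Override
    public void onPartitionsAssigned(Map<TopicPartition, Long> assignments, ConsumerSeekCallback callback) {
        // no initial seeks needed
    }

    @Override
    public void onIdleContainer(Map<TopicPartition, Long> assignments, ConsumerSeekCallback callback) {
        // not used
    }

    // Blocking on the send future turns an async publish failure into an
    // exception here; seeking back to this record's offset makes it (and
    // everything after it) re-consumed on the next poll.
    @KafkaListener(topics = "in-topic")
    public void listen(ConsumerRecord<String, String> record) {
        try {
            template.send("out-topic", record.value()).get(); // wait for the ack
        } catch (Exception e) {
            seekCallback.get().seek(record.topic(), record.partition(), record.offset());
        }
    }
}
```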

Two-Phase commit in EJB

I am getting a two-phase commit exception in my application for one of the data sources. The point is that the application only does read-only data operations, using Oracle TopLink. Here is what happens in the application:
A request comes in to the web service.
The web service posts to a JMS queue; the application needs a response from the queue, so a request/response pattern is used.
In the message bean (let's call it ProcessBean), several successful hits go to the Oracle DB using Oracle TopLink; no exception is thrown.
After the DB data is read, control goes to a call to the Blaze rules RMI API provided by Blaze; we get a successful result.
The bean posts the response message back on the response queue.
Now the exception occurs and control comes back to ProcessBean.
The web service never gets the response back.
P.S. If I disable global transactions in the WebLogic connection pool, then everything works fine. Or if I check "Enable Two-Phase Commit", then everything also works fine.
