We are evaluating if we can use Embedded Kafka for testing our streams app. Can we achieve the following using Embedded Kafka:
Use KafkaTestUtils to produce the message to the input topic running on Embedded Kafka.
Expect the streams application to pick up the message for processing (should the application be running for this?)
Use KafkaTestUtils to consume the processed message from output topic.
Assert the expected condition.
Please let me know. Thanks.
Yes; you can do that.
There are examples in the framework tests https://github.com/spring-projects/spring-kafka/tree/main/spring-kafka/src/test/java/org/springframework/kafka/streams
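For illustration, a minimal sketch of such a test, assuming a Spring Boot test in which the topology under test reads from "input-topic" and writes to "output-topic" (placeholder names) and the streams application is started against the embedded broker, e.g. by setting spring.kafka.streams.bootstrap-servers=${spring.embedded.kafka.brokers} in the test properties (so yes, the application has to be running for the processing step):

import static org.assertj.core.api.Assertions.assertThat;

import java.util.Map;

import org.apache.kafka.clients.consumer.Consumer;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.test.EmbeddedKafkaBroker;
import org.springframework.kafka.test.context.EmbeddedKafka;
import org.springframework.kafka.test.utils.KafkaTestUtils;

@SpringBootTest
@EmbeddedKafka(partitions = 1, topics = { "input-topic", "output-topic" })
class StreamsAppEmbeddedKafkaTests {

    @Autowired
    private EmbeddedKafkaBroker embeddedKafka;

    @Test
    void processesRecordEndToEnd() {
        // produce a record to the input topic using KafkaTestUtils producer properties
        Map<String, Object> producerProps = KafkaTestUtils.producerProps(this.embeddedKafka);
        DefaultKafkaProducerFactory<String, String> pf = new DefaultKafkaProducerFactory<>(
                producerProps, new StringSerializer(), new StringSerializer());
        KafkaTemplate<String, String> template = new KafkaTemplate<>(pf);
        template.send("input-topic", "key", "some payload");
        template.flush();

        // consume the processed record from the output topic
        Map<String, Object> consumerProps = KafkaTestUtils.consumerProps("test-group", "false", this.embeddedKafka);
        consumerProps.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        Consumer<String, String> consumer = new DefaultKafkaConsumerFactory<>(
                consumerProps, new StringDeserializer(), new StringDeserializer()).createConsumer();
        this.embeddedKafka.consumeFromAnEmbeddedTopic(consumer, "output-topic");
        ConsumerRecord<String, String> processed = KafkaTestUtils.getSingleRecord(consumer, "output-topic");

        // assert the expected condition on the processed record
        assertThat(processed.value()).isNotNull();

        consumer.close();
        pf.destroy();
    }
}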
I'm exploring the Spring Cloud Stream Kafka binder and playing with it. Is there a way to know the list of channels after startup, maybe just by printing the values in a logger?
I'm specifically looking for errorChannels and recordMetaData; I know that there is a default channel for errors.
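One approach, as a minimal sketch, assuming the annotation-based channel programming model in which Spring Cloud Stream registers every bound channel (including errorChannel) as a MessageChannel bean, is to dump those beans from the ApplicationContext once the app is up; the log output then shows whatever channel beans the binder actually created in your context:

import java.util.Map;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.boot.ApplicationArguments;
import org.springframework.boot.ApplicationRunner;
import org.springframework.context.ApplicationContext;
import org.springframework.messaging.MessageChannel;
import org.springframework.stereotype.Component;

@Component
public class ChannelLogger implements ApplicationRunner {

    private static final Logger log = LoggerFactory.getLogger(ChannelLogger.class);

    private final ApplicationContext context;

    public ChannelLogger(ApplicationContext context) {
        this.context = context;
    }

    @Override
    public void run(ApplicationArguments args) {
        // every bound channel (including error channels) is registered as a MessageChannel bean
        Map<String, MessageChannel> channels = this.context.getBeansOfType(MessageChannel.class);
        channels.forEach((name, channel) ->
                log.info("Channel bean '{}' -> {}", name, channel.getClass().getSimpleName()));
    }
}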
I am new to spring-kafka. Basically I have a Spring Boot app with Spring Kafka (spring-kafka 2.9.0 and spring-boot 2.6.4). As long as I run the program as either a producer or a consumer, I don't run into any issues.
But if I run the same program to produce messages to topic-A and listen for messages from topic-B at the same time, I run into deserialization errors in the producer (which is confusing) while sending messages to topic-A. The producer and consumer have their own configs; the producer serializes one POJO and the consumer deserializes a different POJO, but I am failing to understand why the consumer is invoked while the producer is producing messages.
Can someone please help me understand what I am doing wrong?
My apologies; on further investigation I found that the issue was not with spring-kafka. It's a symptom of some other issue: I am using a connector to read and write messages to a database. For some reason, when the producer publishes a message, the sink connector publishes messages to topic-B. Since the Kafka consumer is listening to topic-B and is not configured to deserialize the newly published messages, it runs into exceptions. This has nothing to do with Spring Kafka.
I'm using Axon version 3.3, which seamlessly supports Kafka, with an annotation in the Spring Boot main class:
@SpringBootApplication(exclude = KafkaAutoConfiguration.class)
In our use case, the command-side microservice needs to pick up messages from a Kafka topic rather than exposing a REST API. It will store the event in the event store and then move it to another Kafka topic for the query-side microservice to consume.
Since KafkaAutoConfiguration is disabled, I cannot use spring-kafka configuration to write a consumer. How can I consume a normal message in Axon?
I tried writing a normal Spring Kafka consumer, but since KafkaAutoConfiguration is disabled, the initial trigger for the command is not picked up from the Kafka topic.
I think I can help you out with this.
The Axon Kafka Extension is solely meant for Events.
Thus, it is not intended to dispatch Commands or Queries from one node to another.
This is very intentional, as Event messages have different routing needs compared to Command and Query messages.
Axon views Kafka as a fine fit for an Event Bus, and as such this is supported through the framework.
It is however not ideal for Command messages (should be routed to a single handler, always) or Query messages (can be routed to a single handler, several handlers or have a subscription model).
Thus, if you'd want to "abuse" Kafka for different types of messages in conjunction with Axon, you will have to write your own component/service for it (a sketch follows below).
I would however stick to the messaging paradigm and separate these concerns.
To greatly simplify routing messages between Axon applications, I'd highly recommend trying out Axon Server.
Additionally, here you can hear/see Allard Buijze point out the different routing needs per message type (thus the reason why Axon's Kafka Extension only deals with Event messages).
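For illustration, a minimal sketch of such a hand-rolled bridge, assuming a plain Spring Kafka listener (you would have to declare your own ConsumerFactory and listener container factory beans, since KafkaAutoConfiguration is excluded) and a hypothetical CreateSomethingCommand built from the record payload:

import org.axonframework.commandhandling.gateway.CommandGateway;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class CommandDispatchingKafkaListener {

    private final CommandGateway commandGateway;

    public CommandDispatchingKafkaListener(CommandGateway commandGateway) {
        this.commandGateway = commandGateway;
    }

    // topic and container factory names are placeholders for your own configuration
    @KafkaListener(topics = "command-topic", containerFactory = "commandKafkaListenerContainerFactory")
    public void onMessage(String payload) {
        // translate the raw Kafka message into an Axon command and dispatch it;
        // the resulting event can then flow to the query-side topic via the Kafka Extension
        this.commandGateway.sendAndWait(new CreateSomethingCommand(payload));
    }

    // hypothetical command type, shown only to keep the sketch self-contained;
    // a real command would normally carry a @TargetAggregateIdentifier for routing
    public static class CreateSomethingCommand {

        private final String payload;

        public CreateSomethingCommand(String payload) {
            this.payload = payload;
        }

        public String getPayload() {
            return this.payload;
        }
    }
}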
I am using the latest version of spring-kafka and using @KafkaListener with a batch listener. In the method that listens to the list of messages, I want to call acknowledge only if the batch of records has been processed. But the framework does not send those messages again until I restart the application. So I used the stop() and start() methods on KafkaListenerEndpointRegistry when the records were not processed, but I feel that's not a good way of solving the problem. Is there a better way of handling this?
See the documentation for the SeekToCurrentBatchErrorHandler.
The SeekToCurrentBatchErrorHandler seeks each partition to the first record in each partition in the batch so the whole batch is replayed. This error handler does not support recovery because the framework cannot know which message in the batch is failing.
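For illustration, a minimal sketch of wiring that error handler into a batch listener container factory, assuming a spring-kafka version in which SeekToCurrentBatchErrorHandler is still available (newer versions replace it with DefaultErrorHandler); the ConsumerFactory bean is assumed to exist elsewhere in your configuration:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.listener.SeekToCurrentBatchErrorHandler;

@Configuration
public class BatchListenerConfig {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
            ConsumerFactory<String, String> consumerFactory) {
        ConcurrentKafkaListenerContainerFactory<String, String> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        factory.setBatchListener(true);
        // when the listener throws instead of acknowledging, each partition is sought back
        // to the first record of the batch and the whole batch is redelivered
        factory.setBatchErrorHandler(new SeekToCurrentBatchErrorHandler());
        return factory;
    }
}

With this in place, the listener simply throws an exception when the batch cannot be processed, instead of stopping and restarting the container through KafkaListenerEndpointRegistry.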
I'm new to Spring Cloud Contract. I have written the Groovy contract but the WireMock tests are failing. All I see in the console is:
org.junit.ComparisonFailure: expected:<[2]00> but was:<[4]00>
Can anyone please guide me on how to enable more debugging, and also, is there a way to print the request and response sent by WireMock?
I have set logging.level.com.github.tomakehurst.wiremock=DEBUG in my Spring Boot app, but no luck.
If you use one of the latest versions of sc-contract, WireMock should print exactly what wasn't matched. You can also read the documentation over here https://cloud.spring.io/spring-cloud-static/Finchley.RELEASE/single/spring-cloud.html#_how_can_i_debug_the_request_response_being_sent_by_the_generated_tests_client where we answer your questions in more depth. Let me copy that part for you:
86.8 How can I debug the request/response being sent by the generated tests client?
The generated tests all boil down to RestAssured in some form or fashion, which relies on Apache HttpClient. HttpClient has a facility called wire logging which logs the entire request and response to HttpClient. Spring Boot has a logging common application property for doing this sort of thing; just add this to your application properties:
logging.level.org.apache.http.wire=DEBUG
86.8.1 How can I debug the mapping/request/response being sent by WireMock?
Starting from version 1.2.0 we turn on WireMock logging to info and the WireMock notifier to being verbose. Now you will know exactly what request was received by the WireMock server and which matching response definition was picked.
To turn off this feature just bump WireMock logging to ERROR:
logging.level.com.github.tomakehurst.wiremock=ERROR
86.8.2 How can I see what got registered in the HTTP server stub?
You can use the mappingsOutputFolder property on @AutoConfigureStubRunner or StubRunnerRule to dump all mappings per artifact id. Also the port at which the given stub server was started will be attached.
86.8.3 Can I reference text from file?
Yes! With version 1.2.0 we've added such a possibility. It's enough to call the file(…) method in the DSL and provide a path relative to where the contract lays. If you're using YAML just use the bodyFromFile property.
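For illustration, a minimal sketch of the mappingsOutputFolder option mentioned above, assuming a Spring Boot test with Stub Runner on the classpath; the stub coordinates, port, and output folder are placeholders:

import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.cloud.contract.stubrunner.spring.AutoConfigureStubRunner;
import org.springframework.cloud.contract.stubrunner.spring.StubRunnerProperties;

@SpringBootTest
@AutoConfigureStubRunner(
        ids = "com.example:producer-service:+:stubs:8090",
        stubsMode = StubRunnerProperties.StubsMode.LOCAL,
        mappingsOutputFolder = "target/outputmappings")
class StubMappingsDumpTests {

    @Test
    void dumpsStubMappings() {
        // each started stub server writes its WireMock mappings (plus the port it runs on)
        // into target/outputmappings, which helps to see why a request got a 400 instead of a 200
    }
}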