Is it possible to use ReplyingKafkaTemplate with Spring Cloud Stream? Is there a code example showing how to configure it?
This is all in one application but illustrates how it works...
@SpringBootApplication
@EnableBinding(Processor.class)
public class So57380643Application {

    public static void main(String[] args) {
        SpringApplication.run(So57380643Application.class, args).close();
    }

    @Bean
    public ReplyingKafkaTemplate<byte[], byte[], byte[]> replyer(ProducerFactory<byte[], byte[]> pf,
            ConcurrentMessageListenerContainer<byte[], byte[]> replyContainer) {

        return new ReplyingKafkaTemplate<>(pf, replyContainer);
    }

    @Bean
    public ConcurrentMessageListenerContainer<byte[], byte[]> replyContainer(
            ConcurrentKafkaListenerContainerFactory<byte[], byte[]> factory) {

        ConcurrentMessageListenerContainer<byte[], byte[]> container = factory.createContainer("replyTopic");
        container.getContainerProperties().setGroupId("replies.group");
        return container;
    }

    @StreamListener(Processor.INPUT)
    @SendTo(Processor.OUTPUT)
    public String listen(String in) {
        return in.toUpperCase();
    }

    @Bean
    public ApplicationRunner runner(ReplyingKafkaTemplate<byte[], byte[], byte[]> replyer) {
        return args -> {
            ProducerRecord<byte[], byte[]> record = new ProducerRecord<>("requestTopic", "foo".getBytes());
            RequestReplyFuture<byte[], byte[], byte[]> future = replyer.sendAndReceive(record);
            RecordMetadata meta = future.getSendFuture().get(10, TimeUnit.SECONDS).getRecordMetadata();
            System.out.println(meta);
            ConsumerRecord<byte[], byte[]> consumerRecord = future.get(10, TimeUnit.SECONDS);
            System.out.println(new String(consumerRecord.value()));
        };
    }

}
and
spring:
  kafka:
    consumer:
      enable-auto-commit: false
      auto-offset-reset: earliest
  cloud:
    stream:
      bindings:
        input:
          destination: requestTopic
          group: so57380643
        output:
          destination: replyTopic
Result:
requestTopic-0#3
FOO
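For reference, a sketch of the build dependencies this example assumes; the artifact IDs are the standard ones, with versions managed by the Spring Cloud BOM (the Kafka binder pulls in spring-kafka, which provides ReplyingKafkaTemplate):

<!-- assumed dependencies; versions come from the Spring Cloud BOM -->
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-stream-kafka</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>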
I am working on the Spring Kafka batch listener filter strategy. I am facing an issue where the filtered events keep being redelivered again and again. Could anyone help me with this issue? Spring Boot with Kafka (version 2.3.8).
Here is my configuration:
ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
        new ConcurrentKafkaListenerContainerFactory<>();
configurer.configure(factory, kafkaConsumerFactory);
factory.setBatchListener(true);
factory.setAckDiscarded(true);
factory.getContainerProperties().setIdleBetweenPolls(30000);
factory.setRecordFilterStrategy(consumerRecord -> {
    try {
        // assuming String record values; returning true discards (filters out) the record
        MyObject myObject = new ObjectMapper().readValue((String) consumerRecord.value(), MyObject.class);
        return myObject.frequency <= 10;
    }
    catch (IOException e) {
        throw new IllegalStateException(e);
    }
});
factory.setBatchErrorHandler(new SeekToCurrentBatchErrorHandler());
When using batch mode with MANUAL acks, if you filter all the records (discard them all), the listener will get an empty list, so you can still acknowledge the batch to commit the offsets.
I just tested it and it works as expected.
@SpringBootApplication
public class So67259790Application {

    public static void main(String[] args) {
        SpringApplication.run(So67259790Application.class, args);
    }

    @KafkaListener(id = "so67259790", topics = "so67259790")
    public void listen(List<String> in, Acknowledgment ack) {
        System.out.println(in);
        ack.acknowledge();
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so67259790").partitions(1).replicas(1).build();
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
        return args -> {
            template.send("so67259790", "foo");
            template.send("so67259790", "bar");
        };
    }

    @Bean
    public RecordFilterStrategy<Object, Object> rfs() {
        return rec -> true; // discard every record; the listener still gets an (empty) batch
    }

}
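For completeness, a minimal properties sketch for running this example with Boot's auto-configured container factory; it assumes Boot picks up the RecordFilterStrategy bean (recent Boot versions apply it to the auto-configured factory) and enables batch mode and MANUAL acks via configuration:

spring.kafka.listener.type=batch
spring.kafka.listener.ack-mode=MANUAL
spring.kafka.consumer.auto-offset-reset=earliest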
We are using Spring Cloud Stream Hoxton.SR4 to consume messages from a Kafka topic. We've enabled spring.cloud.stream.bindings.<binding-name>.consumer.batch-mode=true, fetching 2000 records per poll. I would like to know if there is a way we can manually acknowledge/commit the entire batch.
SR4 is quite old; the current Hoxton release is SR9, and the current Spring Cloud Stream version is 3.0.10.RELEASE (Hoxton.SR9 pulls in 3.0.9).
You need to consume a Message and get the Acknowledgment from a header.
@SpringBootApplication
public class So652289261Application {

    public static void main(String[] args) {
        SpringApplication.run(So652289261Application.class, args);
    }

    @Bean
    Consumer<Message<List<Foo>>> consume() {
        return msg -> {
            System.out.println(msg.getPayload());
            msg.getHeaders().get(KafkaHeaders.ACKNOWLEDGMENT, Acknowledgment.class).acknowledge();
        };
    }

    @Bean
    public ListenerContainerCustomizer<AbstractMessageListenerContainer<?, ?>> customizer() {
        return (container, dest, group) -> container.getContainerProperties()
                .setCommitLogLevel(LogIfLevelEnabled.Level.INFO);
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<byte[], byte[]> template) {
        return args -> {
            template.send("consume-in-0", "{\"bar\":\"baz\"}".getBytes());
            template.send("consume-in-0", "{\"bar\":\"qux\"}".getBytes());
        };
    }

    public static class Foo {

        private String bar;

        public Foo() {
        }

        public Foo(String bar) {
            this.bar = bar;
        }

        public String getBar() {
            return this.bar;
        }

        public void setBar(String bar) {
            this.bar = bar;
        }

        @Override
        public String toString() {
            return "Foo [bar=" + this.bar + "]";
        }

    }

}
Properties for Boot 2.3.6 and Cloud Hoxton.SR9
spring.cloud.stream.bindings.consume-in-0.group=so65228926
spring.cloud.stream.bindings.consume-in-0.consumer.batch-mode=true
spring.cloud.stream.kafka.bindings.consume-in-0.consumer.auto-commit-offset=false
spring.kafka.producer.properties.linger.ms=50
Properties for Boot 2.4.0 and Cloud 2020.0.0-M6
spring.cloud.stream.bindings.consume-in-0.group=so65228926
spring.cloud.stream.bindings.consume-in-0.consumer.batch-mode=true
spring.cloud.stream.kafka.bindings.consume-in-0.consumer.ack-mode=MANUAL
spring.kafka.producer.properties.linger.ms=50
[Foo [bar=baz], Foo [bar=qux]]
... Committing: {consume-in-0-0=OffsetAndMetadata{offset=14, leaderEpoch=null, metadata=''}}
I'd like to use a Kafka state store of type KeyValueStore in a sample application using the Kafka Binder of Spring Cloud Stream.
According to the documentation, it should be pretty simple.
This is my main class:
@SpringBootApplication
public class KafkaStreamTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaStreamTestApplication.class, args);
    }

    @Bean
    public BiFunction<KStream<String, String>, KeyValueStore<String, String>, KStream<String, String>> process() {
        return (input, store) -> input.mapValues(v -> v.toUpperCase());
    }

    @Bean
    public StoreBuilder myStore() {
        return Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore("my-store"), Serdes.String(),
                Serdes.String());
    }

}
I suppose that the KeyValueStore should be passed as the second parameter of the "process" method, but the application fails to start with the message below:
Caused by: java.lang.IllegalStateException: No factory found for binding target type: org.apache.kafka.streams.state.KeyValueStore among registered factories: channelFactory,messageSourceFactory,kStreamBoundElementFactory,kTableBoundElementFactory,globalKTableBoundElementFactory
at org.springframework.cloud.stream.binding.AbstractBindableProxyFactory.getBindingTargetFactory(AbstractBindableProxyFactory.java:82) ~[spring-cloud-stream-3.0.3.RELEASE.jar:3.0.3.RELEASE]
at org.springframework.cloud.stream.binder.kafka.streams.function.KafkaStreamsBindableProxyFactory.bindInput(KafkaStreamsBindableProxyFactory.java:191) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
at org.springframework.cloud.stream.binder.kafka.streams.function.KafkaStreamsBindableProxyFactory.afterPropertiesSet(KafkaStreamsBindableProxyFactory.java:103) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1855) ~[spring-beans-5.2.5.RELEASE.jar:5.2.5.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1792) ~[spring-beans-5.2.5.RELEASE.jar:5.2.5.RELEASE]
I found the solution by reading a unit test in Spring Cloud Stream.
The code below shows how I applied that solution to my code.
The transformer uses the store provided by the Spring bean method "myStore":
@SpringBootApplication
public class KafkaStreamTestApplication {

    public static final String MY_STORE_NAME = "my-store";

    public static void main(String[] args) {
        SpringApplication.run(KafkaStreamTestApplication.class, args);
    }

    @Bean
    public Function<KStream<String, String>, KStream<String, String>> process2() {
        return input -> input
                .transformValues(() -> new MyValueTransformer(), MY_STORE_NAME);
    }

    @Bean
    public StoreBuilder<?> myStore() {
        return Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore(MY_STORE_NAME), Serdes.String(),
                Serdes.String());
    }

}
public class MyValueTransformer implements ValueTransformer<String, String> {

    private KeyValueStore<String, String> store;

    private ProcessorContext context;

    @Override
    @SuppressWarnings("unchecked")
    public void init(ProcessorContext context) {
        this.context = context;
        this.store = (KeyValueStore<String, String>) this.context
                .getStateStore(KafkaStreamTestApplication.MY_STORE_NAME);
    }

    @Override
    public String transform(String value) {
        String tValue = store.get(value);
        if (tValue == null) {
            store.put(value, value.toUpperCase());
        }
        return tValue;
    }

    @Override
    public void close() {
        // do not close the store here; its lifecycle is managed by Kafka Streams
    }

}
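For reference, a sketch of the binding properties this function-style setup would need, assuming Spring Cloud Stream 3.0.x; the topic names here are hypothetical:

spring.cloud.stream.function.definition=process2
# hypothetical topic names
spring.cloud.stream.bindings.process2-in-0.destination=input-topic
spring.cloud.stream.bindings.process2-out-0.destination=output-topic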
I am trying to write a Kafka consumer using the spring-kafka 2.3.0.M2 library.
To handle runtime errors I am using SeekToCurrentErrorHandler with DeadLetterPublishingRecoverer as my recoverer. This works fine when my consumer code throws an exception, but it fails when the message cannot be deserialized.
I tried implementing ErrorHandler myself and was successful, but with this approach I end up writing the DLT-publishing code myself, which I do not want to do.
Below are my Kafka properties:
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
      auto-offset-reset: latest
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      properties:
        spring.json.trusted.packages: com.mypackage
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
@Bean
public ConcurrentKafkaListenerContainerFactory<Object, Object> kafkaListenerContainerFactory(
        ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
        ConsumerFactory<Object, Object> kafkaConsumerFactory,
        KafkaTemplate<Object, Object> template) {

    ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
    configurer.configure(factory, kafkaConsumerFactory);
    factory.setErrorHandler(new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template), maxFailures));
    return factory;
}
It works fine for me (note that Boot will auto-configure the error handler)...
@SpringBootApplication
public class So56728833Application {

    public static void main(String[] args) {
        SpringApplication.run(So56728833Application.class, args);
    }

    @Bean
    public SeekToCurrentErrorHandler errorHandler(KafkaTemplate<String, String> template) {
        SeekToCurrentErrorHandler eh = new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template), 3);
        eh.setClassifier( // retry for all except deserialization exceptions
                new BinaryExceptionClassifier(Collections.singletonList(DeserializationException.class), false));
        return eh;
    }

    @KafkaListener(id = "so56728833", topics = "so56728833")
    public void listen(Foo in) {
        System.out.println(in);
        if (in.getBar().equals("baz")) {
            throw new IllegalStateException("Test retries");
        }
    }

    @KafkaListener(id = "so56728833dlt", topics = "so56728833.DLT")
    public void listenDLT(Object in) {
        System.out.println("Received from DLT: " + (in instanceof byte[] ? new String((byte[]) in) : in));
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so56728833").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic dlt() {
        return TopicBuilder.name("so56728833.DLT").partitions(1).replicas(1).build();
    }

    public static class Foo {

        private String bar;

        public Foo() {
            super();
        }

        public Foo(String bar) {
            this.bar = bar;
        }

        public String getBar() {
            return this.bar;
        }

        public void setBar(String bar) {
            this.bar = bar;
        }

        @Override
        public String toString() {
            return "Foo [bar=" + this.bar + "]";
        }

    }

}
spring:
  kafka:
    consumer:
      auto-offset-reset: earliest
      enable-auto-commit: false
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      properties:
        spring.json.trusted.packages: com.example
        spring.deserializer.key.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.json.value.default.type: com.example.So56728833Application$Foo
    producer:
      key-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer

logging:
  level:
    org.springframework.kafka: trace
I have 3 records in the topic:
"badJSON"
"{\"bar\":\"baz\"}"
"{\"bar\":\"qux\"}"
I see the first one going directly to the DLT, and the second one goes there after 3 attempts.
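For reference, one way such test records could be produced, a sketch using the stock console producer (broker address assumed to be localhost:9092); the first value is deliberately malformed JSON, so it fails deserialization and goes straight to the DLT:

$ kafka-console-producer.sh --broker-list localhost:9092 --topic so56728833
>badJSON
>{"bar":"baz"}
>{"bar":"qux"}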
I have 2 classes: one for the factories and the other for the listener containers:
public class ConsumerFactories {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<String, Byte[]> adeKafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String, Byte[]> factory =
                new ConcurrentKafkaListenerContainerFactory<String, Byte[]>();
        factory.setConsumerFactory(consumerFactory1());
        factory.setConsumerFactory(consumerFactory2()); // note: this overwrites consumerFactory1
        factory.getContainerProperties().setPollTimeout(3000);
        return factory;
    }

}
And my listener class has multiple containers:
@Bean
public ConcurrentMessageListenerContainer<String, byte[]> adeListenerContainer() throws BeansException, ClassNotFoundException {
    final ContainerProperties containerProperties = new ContainerProperties("topic1");
    containerProperties.setMessageListener(new MessageListener<String, byte[]>() {

        @Override
        public void onMessage(ConsumerRecord<String, byte[]> record) {
            System.out.println("Thread is: " + Thread.currentThread().getName());
        }

    });
    ConcurrentMessageListenerContainer<String, byte[]> container =
            new ConcurrentMessageListenerContainer<>(consumerFactory1, containerProperties);
    container.setBeanName("bean1");
    container.setConcurrency(60);
    container.start();
    return container;
}

@Bean
public ConcurrentMessageListenerContainer<String, byte[]> adeListenerContainer2() throws BeansException, ClassNotFoundException {
    final ContainerProperties containerProperties = new ContainerProperties("topic1");
    containerProperties.setMessageListener(new MessageListener<String, byte[]>() {

        @Override
        public void onMessage(ConsumerRecord<String, byte[]> record) {
            System.out.println("Thread is: " + Thread.currentThread().getName());
        }

    });
    ConcurrentMessageListenerContainer<String, byte[]> container =
            new ConcurrentMessageListenerContainer<>(consumerFactory2, containerProperties);
    container.setBeanName("bean2");
    container.setConcurrency(60);
    container.start();
    return container;
}
1) How can I write unit tests for these 2 classes and their methods?
2) Since all my listener containers do the same processing work, just for different sets of topics, can I pass the topics in when setting the consumerFactory, or in some other way?
1.
container.start();
Never start() components in bean definitions - the application context is not ready yet; the containers will be started automatically at the right time, as long as autoStartup is true (the default).
Why do you need a container factory if you are creating the containers yourself?
It's not clear what you want to test.
EDIT
Here's an example of programmatically registering containers, using Spring Boot's auto-configured container factory (2.2 and above)...
@SpringBootApplication
public class So53752783Application {

    public static void main(String[] args) {
        SpringApplication.run(So53752783Application.class, args);
    }

    @SuppressWarnings("unchecked")
    @Bean
    public SmartInitializingSingleton creator(ConfigurableListableBeanFactory beanFactory,
            ConcurrentKafkaListenerContainerFactory<String, String> factory) {

        return () -> Stream.of("foo", "bar", "baz").forEach(topic -> {
            ConcurrentMessageListenerContainer<String, String> container = factory.createContainer(topic);
            container.getContainerProperties().setMessageListener((MessageListener<String, String>) record -> {
                System.out.println("Received " + record);
            });
            container.getContainerProperties().setGroupId(topic + ".group");
            container = (ConcurrentMessageListenerContainer<String, String>)
                    beanFactory.initializeBean(container, topic + ".container");
            beanFactory.registerSingleton(topic + ".container", container);
            container.start();
        });
    }

}
To unit test your listener, get it from the container:
container.getContainerProperties().getMessageListener()
then cast it, invoke onMessage(), and verify it did what you expected.
EDIT 2: Unit testing the listener
@SpringBootApplication
public class So53752783Application {

    public static void main(String[] args) {
        SpringApplication.run(So53752783Application.class, args);
    }

    @SuppressWarnings("unchecked")
    @Bean
    public SmartInitializingSingleton creator(ConfigurableListableBeanFactory beanFactory,
            ConcurrentKafkaListenerContainerFactory<String, String> factory,
            MyListener listener) {

        return () -> Stream.of("foo", "bar", "baz").forEach(topic -> {
            ConcurrentMessageListenerContainer<String, String> container = factory.createContainer(topic);
            container.getContainerProperties().setMessageListener(listener);
            container.getContainerProperties().setGroupId(topic + ".group");
            container = (ConcurrentMessageListenerContainer<String, String>)
                    beanFactory.initializeBean(container, topic + ".container");
            beanFactory.registerSingleton(topic + ".container", container);
            container.start();
        });
    }

    @Bean
    public MyListener listener() {
        return new MyListener();
    }

    public static class MyListener implements MessageListener<String, String> {

        @Autowired
        private Service service;

        public void setService(Service service) {
            this.service = service;
        }

        @Override
        public void onMessage(ConsumerRecord<String, String> data) {
            this.service.callSomeService(data.value().toUpperCase());
        }

    }

    public interface Service {

        void callSomeService(String in);

    }

    @Component
    public static class DefaultService implements Service {

        @Override
        public void callSomeService(String in) {
            // ...
        }

    }

}
and
@RunWith(SpringRunner.class)
@SpringBootTest
public class So53752783ApplicationTests {

    @Autowired
    private ApplicationContext context;

    @Test
    public void test() {
        ConcurrentMessageListenerContainer<?, ?> container = context.getBean("foo.container",
                ConcurrentMessageListenerContainer.class);
        MyListener messageListener = (MyListener) container.getContainerProperties().getMessageListener();
        Service service = mock(Service.class);
        messageListener.setService(service);
        messageListener.onMessage(new ConsumerRecord<>("foo", 0, 0L, "key", "foo"));
        verify(service).callSomeService("FOO");
    }

}