Processing strategy of messages in a Spring Kafka listener - spring-kafka

I just want to make sure that messages are processed the correct way. When a message is received at the listener, it is always processed by a new thread (the processor bean is defined with prototype scope). Is this implementation correct? (I considered that the listener is not thread-safe, which is why prototype scope is used for the bean that processes the message.)
(Input: TestTopic - 5 partitions - 1 consumer) or (Input: TestTopic - 5 partitions - 5 consumers)
public class EventListener {

    @Autowired
    private EventProcessor eventProcessor;

    @KafkaListener(topics = "TestTopic", containerFactory = "kafkaListenerContainerFactory",
            autoStartup = "true")
    public void onMessage(
            @Payload List<ConsumerRecord<String, String>> consumerRecords, Acknowledgment acknowledgment) {
        eventProcessor.processAndAcknowledgeBatchMessages(consumerRecords, acknowledgment);
    }
}
//event processor
@Slf4j
@Component
@Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
@NoArgsConstructor
@SuppressWarnings("unused")
public class EventProcessorImpl implements EventProcessor {

    @Autowired
    private KafkaProducerTemplate kafkaProducerTemplate;

    @Autowired
    private ObjectMapper localObjectMapper;

    @Autowired
    private Dao dao;

    public void processAndAcknowledgeBatchMessages(
            List<ConsumerRecord<String, String>> consumerRecords, Acknowledgment acknowledgment) {
        long start = System.currentTimeMillis();
        consumerRecords.forEach(consumerRecord -> {
            try {
                Event event = localObjectMapper.readValue(consumerRecord.value(), Event.class);
                dao.save(process(event));
            }
            catch (IOException e) { // readValue throws a checked exception
                throw new UncheckedIOException(e);
            }
        });
        acknowledgment.acknowledge();
    }
}

No, it is not correct; you should not hand the records off to another thread; it will cause problems with committing offsets and error handling.
Also, making the EventProcessorImpl a prototype bean won't help. That just means a new instance is created each time the bean is referenced.
Since it is @Autowired, it is only referenced once, during initialization. To get a new instance for each request, you would need to call getBean() on the application context each time.
It is better to make your code thread-safe.
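For example (a minimal sketch, not from the original answer; it assumes the injected collaborators such as ObjectMapper and the DAO are themselves thread-safe, which they typically are), a processor that keeps all per-message state in local variables can stay a plain singleton:

@Component
public class StatelessEventProcessor implements EventProcessor {

    private final ObjectMapper objectMapper;
    private final Dao dao;

    public StatelessEventProcessor(ObjectMapper objectMapper, Dao dao) {
        this.objectMapper = objectMapper;
        this.dao = dao;
    }

    @Override
    public void processAndAcknowledgeBatchMessages(
            List<ConsumerRecord<String, String>> consumerRecords, Acknowledgment acknowledgment) {
        for (ConsumerRecord<String, String> record : consumerRecords) {
            try {
                // All state is local to this call, so concurrent listener
                // threads cannot interfere with each other.
                Event event = objectMapper.readValue(record.value(), Event.class);
                dao.save(event); // hypothetical processing; adapt to your logic
            }
            catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
        acknowledgment.acknowledge();
    }
}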
EDIT
There are (at least) a couple of ways to deal with a non-thread-safe service defined in prototype scope.
Use a ThreadLocal:
@SpringBootApplication
public class So68447863Application {

    public static void main(String[] args) {
        SpringApplication.run(So68447863Application.class, args);
    }

    private static final ThreadLocal<NotThreadSafeService> SERVICES = new ThreadLocal<>();

    @Autowired
    ApplicationContext context;

    @KafkaListener(id = "so68447863", topics = "so68447863", concurrency = "5")
    void listen(String in) {
        NotThreadSafeService service = SERVICES.get();
        if (service == null) {
            service = this.context.getBean(NotThreadSafeService.class);
            SERVICES.set(service);
        }
        service.process(in);
    }

    @EventListener
    void removeService(ConsumerStoppedEvent event) {
        System.out.println("Consumer stopped; removing TL");
        SERVICES.remove();
    }

    @Bean
    NewTopic topic() {
        return TopicBuilder.name("so68447863").partitions(10).replicas(1).build();
    }

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    NotThreadSafeService service() {
        return new NotThreadSafeService();
    }
}

class NotThreadSafeService {

    void process(String msg) {
        System.out.println(msg + " processed by " + this);
    }
}
Use a pool of instances.
@SpringBootApplication
public class So68447863Application {

    public static void main(String[] args) {
        SpringApplication.run(So68447863Application.class, args);
    }

    private static final BlockingQueue<NotThreadSafeService> SERVICES = new LinkedBlockingQueue<>();

    @Autowired
    ApplicationContext context;

    @KafkaListener(id = "so68447863", topics = "so68447863", concurrency = "5")
    void listen(String in) {
        NotThreadSafeService service = SERVICES.poll();
        if (service == null) {
            service = this.context.getBean(NotThreadSafeService.class);
        }
        try {
            service.process(in);
        }
        finally {
            SERVICES.add(service);
        }
    }

    @Bean
    NewTopic topic() {
        return TopicBuilder.name("so68447863").partitions(10).replicas(1).build();
    }

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    NotThreadSafeService service() {
        return new NotThreadSafeService();
    }
}

class NotThreadSafeService {

    void process(String msg) {
        System.out.println(msg + " processed by " + this);
    }
}

Related

Axon: Sending multiple commands to the same aggregate

If I send multiple commands to the same aggregate, only the first is handled.
Is this a configuration problem, or am I missing something?
The message I am getting after the 2nd command is sent:
org.springframework.web.util.NestedServletException: Request processing failed; nested exception is org.axonframework.commandhandling.CommandExecutionException: Cannot invoke "Object.hashCode()" because "key" is null
The service method where I do my sending of the command is:
public void maakAanvraag() {
    UUID aanvraagId = UUID.randomUUID();
    commandGateway.sendAndWait(
            VerwerkAanvraag.builder()
                    .aanvraagId(aanvraagId)
                    .build()
    );
    commandGateway.sendAndWait(
            VerwerkPersoonsgegevensVastgesteld.builder()
                    .aanvraagId(aanvraagId)
                    .build()
    );
    commandGateway.sendAndWait(
            VerwerkOrganisatiegegevensVastgesteld.builder()
                    .aanvraagId(aanvraagId)
                    .organisatieId(organisatieView.getOrganisatieId())
                    .rolOrganisatie(rolOrganisatie)
                    .build()
    );
    commandGateway.sendAndWait(
            VerwerkBeperkingErkenningsdoelGematcht.builder()
                    .aanvraagId(aanvraagId)
                    .build());
}
The aggregate I am using is:
@Aggregate
@Getter
@NoArgsConstructor
public class Aanvraag {

    public static final String META_DATA_ZAAKNUMMER = "aanvraag_zaaknummer";

    @AggregateIdentifier
    private UUID aanvraagId;

    @CommandHandler
    public Aanvraag(VerwerkAanvraag command) {
        AanvraagGeregistreerd aanvraagGeregistreerd =
                AanvraagGeregistreerd.builder()
                        .aanvraagId(command.getAanvraagId())
                        .build();
        apply(aanvraagGeregistreerd, MetaData.with(META_DATA_ZAAKNUMMER, "123456789"));
    }

    @EventSourcingHandler
    public void on(AanvraagGeregistreerd event) {
        aanvraagId = event.getAanvraagId();
    }

    @CommandHandler
    public void verwerkOrganisatiegegevensVastgesteld(VerwerkOrganisatiegegevensVastgesteld command) {
        OrganisatiegegevensVastgesteld persoonsgegevensVastgesteld =
                OrganisatiegegevensVastgesteld.builder()
                        .aanvraagId(command.getAanvraagId())
                        .build();
        apply(persoonsgegevensVastgesteld);
    }

    @EventSourcingHandler
    public void on(OrganisatiegegevensVastgesteld event) {
        aanvraagId = event.getAanvraagId();
    }

    @CommandHandler
    public void verwerkPersoonsgegevensVastgesteld(VerwerkPersoonsgegevensVastgesteld command) {
        PersoonsgegevensVastgesteld persoonsgegevensVastgesteld =
                PersoonsgegevensVastgesteld.builder()
                        .aanvraagId(command.getAanvraagId())
                        .build();
        apply(persoonsgegevensVastgesteld);
    }

    @EventSourcingHandler
    public void on(PersoonsgegevensVastgesteld event) {
        aanvraagId = event.getAanvraagId();
    }

    @CommandHandler
    public void verwerkBeperkingErkenningsdoelGematcht(VerwerkBeperkingErkenningsdoelGematcht command) {
        BeperkingErkenningsdoelGematcht beperkingErkenningsdoelGematcht =
                BeperkingErkenningsdoelGematcht.builder()
                        .aanvraagId(command.getAanvraagId())
                        .build();
        apply(beperkingErkenningsdoelGematcht);
    }

    @EventSourcingHandler
    public void on(BeperkingErkenningsdoelGematcht event) {
        aanvraagId = event.getAanvraagId();
    }
}
The project uses Spring Boot 2.6.6 with axon-spring-boot-starter 4.5.9.
It all runs on Java Temurin 17.0.3.
We solved the problem.
The issue had nothing to do with Axon.
The problem was a logging interceptor.
After removing this logging interceptor, Axon worked as expected.

How to bind a Store using Spring Cloud Stream and Kafka?

I'd like to use a Kafka state store of type KeyValueStore in a sample application using the Kafka Binder of Spring Cloud Stream.
According to the documentation, it should be pretty simple.
This is my main class:
@SpringBootApplication
public class KafkaStreamTestApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaStreamTestApplication.class, args);
    }

    @Bean
    public BiFunction<KStream<String, String>, KeyValueStore<String, String>, KStream<String, String>> process() {
        return (input, store) -> input.mapValues(v -> v.toUpperCase());
    }

    @Bean
    public StoreBuilder myStore() {
        return Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore("my-store"), Serdes.String(),
                Serdes.String());
    }
}
I suppose that the KeyValueStore should be passed as the second parameter of the "process" method, but the application fails to start with the message below:
Caused by: java.lang.IllegalStateException: No factory found for binding target type: org.apache.kafka.streams.state.KeyValueStore among registered factories: channelFactory,messageSourceFactory,kStreamBoundElementFactory,kTableBoundElementFactory,globalKTableBoundElementFactory
    at org.springframework.cloud.stream.binding.AbstractBindableProxyFactory.getBindingTargetFactory(AbstractBindableProxyFactory.java:82) ~[spring-cloud-stream-3.0.3.RELEASE.jar:3.0.3.RELEASE]
    at org.springframework.cloud.stream.binder.kafka.streams.function.KafkaStreamsBindableProxyFactory.bindInput(KafkaStreamsBindableProxyFactory.java:191) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
    at org.springframework.cloud.stream.binder.kafka.streams.function.KafkaStreamsBindableProxyFactory.afterPropertiesSet(KafkaStreamsBindableProxyFactory.java:103) ~[spring-cloud-stream-binder-kafka-streams-3.0.3.RELEASE.jar:3.0.3.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1855) ~[spring-beans-5.2.5.RELEASE.jar:5.2.5.RELEASE]
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1792) ~[spring-beans-5.2.5.RELEASE.jar:5.2.5.RELEASE]
I found the solution in a unit test in Spring Cloud Stream that shows how to use a store.
The code below is how I applied that solution to my code.
The transformer uses the store provided by the Spring bean method "myStore":
@SpringBootApplication
public class KafkaStreamTestApplication {

    public static final String MY_STORE_NAME = "my-store";

    public static void main(String[] args) {
        SpringApplication.run(KafkaStreamTestApplication.class, args);
    }

    @Bean
    public Function<KStream<String, String>, KStream<String, String>> process2() {
        return (input) -> input
                .transformValues(() -> new MyValueTransformer(), MY_STORE_NAME);
    }

    @Bean
    public StoreBuilder<?> myStore() {
        return Stores.keyValueStoreBuilder(
                Stores.persistentKeyValueStore(MY_STORE_NAME), Serdes.String(),
                Serdes.String());
    }
}
public class MyValueTransformer implements ValueTransformer<String, String> {

    private KeyValueStore<String, String> store;
    private ProcessorContext context;

    @Override
    public void init(ProcessorContext context) {
        this.context = context;
        store = (KeyValueStore<String, String>) this.context.getStateStore(KafkaStreamTestApplication.MY_STORE_NAME);
    }

    @Override
    public String transform(String value) {
        String tValue = store.get(value);
        if (tValue == null) {
            store.put(value, value.toUpperCase());
        }
        return tValue;
    }

    @Override
    public void close() {
        if (store != null) {
            store.close();
        }
    }
}
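If you later need to read the store from outside the topology, the Kafka Streams binder provides an InteractiveQueryService you can autowire (a hedged sketch; the REST endpoint and its path are made up for illustration):

@RestController
public class StoreQueryController {

    // Provided by spring-cloud-stream-binder-kafka-streams
    private final InteractiveQueryService interactiveQueryService;

    public StoreQueryController(InteractiveQueryService interactiveQueryService) {
        this.interactiveQueryService = interactiveQueryService;
    }

    @GetMapping("/store/{key}")
    public String lookup(@PathVariable String key) {
        // Look up the read-only view of "my-store" and fetch a value by key.
        ReadOnlyKeyValueStore<String, String> store = interactiveQueryService
                .getQueryableStore(KafkaStreamTestApplication.MY_STORE_NAME,
                        QueryableStoreTypes.<String, String>keyValueStore());
        return store.get(key);
    }
}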

Kafka Spring: How to write unit tests for ConcurrentKafkaListenerContainerFactory and ConcurrentMessageListenerContainer?

I have 2 classes; 1 for the factories and the other for listener containers:
public class ConsumerFactories() {
#Bean
public ConcurrentKafkaListenerContainerFactory<String, Byte[]> adeKafkaListenerContainerFactory() {
ConcurrentKafkaListenerContainerFactory<String, Byte[]> factory = null;
factory = new ConcurrentKafkaListenerContainerFactory<String, Byte[]>();
factory.setConsumerFactory(consumerFactory1());
factory.setConsumerFactory(consumerFactory2());
factory.getContainerProperties().setPollTimeout(3000);
return factory;
}
}
And my listener class has multiple containers:
@Bean
public ConcurrentMessageListenerContainer<String, byte[]> adeListenerContainer() throws BeansException, ClassNotFoundException {
    final ContainerProperties containerProperties =
            new ContainerProperties("topic1");
    containerProperties.setMessageListener(new MessageListener<String, byte[]>() {

        @Override
        public void onMessage(ConsumerRecord<String, byte[]> record) {
            System.out.println("Thread is: " + Thread.currentThread().getName());
        }

    });
    ConcurrentMessageListenerContainer<String, byte[]> container =
            new ConcurrentMessageListenerContainer<>(consumerFactory1, containerProperties);
    container.setBeanName("bean1");
    container.setConcurrency(60);
    container.start();
    return container;
}

@Bean
public ConcurrentMessageListenerContainer<String, byte[]> adeListenerContainer() throws BeansException, ClassNotFoundException {
    final ContainerProperties containerProperties =
            new ContainerProperties("topic1");
    containerProperties.setMessageListener(new MessageListener<String, byte[]>() {

        @Override
        public void onMessage(ConsumerRecord<String, byte[]> record) {
            System.out.println("Thread is: " + Thread.currentThread().getName());
        }

    });
    ConcurrentMessageListenerContainer<String, byte[]> container =
            new ConcurrentMessageListenerContainer<>(consumerFactory2, containerProperties);
    container.setBeanName("bean2");
    container.setConcurrency(60);
    container.start();
    return container;
}
1) How can I write unit tests for these 2 classes and methods?
2) Since all my listener containers are doing the same processing work but for a different set of topics, can I pass the topics when I'm setting consumerFactory or any other way?
1.
container.start();
Never start() components in bean definitions - the application context is not ready yet; the context will automatically start the containers at the right time (as long as autoStartup is true, the default).
Why do you need a container factory if you are creating the containers yourself?
It's not clear what you want to test.
EDIT
Here's an example of programmatically registering containers, using Spring Boot's auto-configured container factory (2.2 and above)...
@SpringBootApplication
public class So53752783Application {

    public static void main(String[] args) {
        SpringApplication.run(So53752783Application.class, args);
    }

    @SuppressWarnings("unchecked")
    @Bean
    public SmartInitializingSingleton creator(ConfigurableListableBeanFactory beanFactory,
            ConcurrentKafkaListenerContainerFactory<String, String> factory) {
        return () -> Stream.of("foo", "bar", "baz").forEach(topic -> {
            ConcurrentMessageListenerContainer<String, String> container = factory.createContainer(topic);
            container.getContainerProperties().setMessageListener((MessageListener<String, String>) record -> {
                System.out.println("Received " + record);
            });
            container.getContainerProperties().setGroupId(topic + ".group");
            container = (ConcurrentMessageListenerContainer<String, String>)
                    beanFactory.initializeBean(container, topic + ".container");
            beanFactory.registerSingleton(topic + ".container", container);
            container.start();
        });
    }
}
To unit test your listener, call
container.getContainerProperties().getMessageListener()
cast it, invoke onMessage(), and verify it did what you expected.
EDIT 2: Unit testing the listener
@SpringBootApplication
public class So53752783Application {

    public static void main(String[] args) {
        SpringApplication.run(So53752783Application.class, args);
    }

    @SuppressWarnings("unchecked")
    @Bean
    public SmartInitializingSingleton creator(ConfigurableListableBeanFactory beanFactory,
            ConcurrentKafkaListenerContainerFactory<String, String> factory,
            MyListener listener) {
        return () -> Stream.of("foo", "bar", "baz").forEach(topic -> {
            ConcurrentMessageListenerContainer<String, String> container = factory.createContainer(topic);
            container.getContainerProperties().setMessageListener(listener);
            container.getContainerProperties().setGroupId(topic + ".group");
            container = (ConcurrentMessageListenerContainer<String, String>)
                    beanFactory.initializeBean(container, topic + ".container");
            beanFactory.registerSingleton(topic + ".container", container);
            container.start();
        });
    }

    @Bean
    public MyListener listener() {
        return new MyListener();
    }

    public static class MyListener implements MessageListener<String, String> {

        @Autowired
        private Service service;

        public void setService(Service service) {
            this.service = service;
        }

        @Override
        public void onMessage(ConsumerRecord<String, String> data) {
            this.service.callSomeService(data.value().toUpperCase());
        }
    }

    public interface Service {

        void callSomeService(String in);
    }

    @Component
    public static class DefaultService implements Service {

        @Override
        public void callSomeService(String in) {
            // ...
        }
    }
}
and
@RunWith(SpringRunner.class)
@SpringBootTest
public class So53752783ApplicationTests {

    @Autowired
    private ApplicationContext context;

    @Test
    public void test() {
        ConcurrentMessageListenerContainer<?, ?> container = context.getBean("foo.container",
                ConcurrentMessageListenerContainer.class);
        MyListener messageListener = (MyListener) container.getContainerProperties().getMessageListener();
        Service service = mock(Service.class); // Mockito's mock(), statically imported
        messageListener.setService(service);
        messageListener.onMessage(new ConsumerRecord<>("foo", 0, 0L, "key", "foo"));
        verify(service).callSomeService("FOO");
    }
}

Logging MDC with @Async and TaskDecorator

Using Spring MVC, I have the following setup:
An AbstractRequestLoggingFilter derived filter for logging requests.
A TaskDecorator to marshal the MDC context mapping from the web request thread to the @Async thread.
I'm attempting to collect context info using MDC (or a ThreadLocal object) for all components involved in handling the request.
I can correctly retrieve the MDC context info from the @Async thread. However, if the @Async thread were to add context info to the MDC, how can I marshal that MDC context info to the thread that handles the response?
TaskDecorator
public class MdcTaskDecorator implements TaskDecorator {

    @Override
    public Runnable decorate(Runnable runnable) {
        // Web thread context
        // Get the logging MDC context
        Map<String, String> contextMap = MDC.getCopyOfContextMap();
        return () -> {
            try {
                // @Async thread context
                // Restore the web thread MDC context
                if (contextMap != null) {
                    MDC.setContextMap(contextMap);
                }
                else {
                    MDC.clear();
                }
                // Run the new thread
                runnable.run();
            }
            finally {
                MDC.clear();
            }
        };
    }
}
Async method
@Async
public CompletableFuture<String> doSomething_Async() {
    MDC.put("doSomething", "started");
    return doit();
}
Logging Filter
public class ServletLoggingFilter extends AbstractRequestLoggingFilter {

    @Override
    protected void beforeRequest(HttpServletRequest request, String message) {
        MDC.put("webthread", Thread.currentThread().getName()); // Will be webthread-1
    }

    @Override
    protected void afterRequest(HttpServletRequest request, String message) {
        MDC.put("responsethread", Thread.currentThread().getName()); // Will be webthread-2
        String s = MDC.get("doSomething"); // Will be null
        // logthis();
    }
}
I hope you have solved the problem already, but if not, here comes a solution.
All you have to do can be summarized in the following two simple steps:
Keep your class MdcTaskDecorator.
Extend AsyncConfigurerSupport in your main class and override getAsyncExecutor() to set the decorator to your customized one as follows:
@SpringBootApplication
@EnableAsync // required for @Async methods to be intercepted
public class AsyncTaskDecoratorApplication extends AsyncConfigurerSupport {

    @Override
    public Executor getAsyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setTaskDecorator(new MdcTaskDecorator());
        executor.initialize();
        return executor;
    }

    public static void main(String[] args) {
        SpringApplication.run(AsyncTaskDecoratorApplication.class, args);
    }
}
Alternatively, create a bean that will pass the MDC properties from the parent thread to the successor thread:
@Configuration
@Slf4j
public class AsyncMDCConfiguration {

    @Bean
    public Executor asyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        // MDCTaskDecorator is a custom class that implements TaskDecorator
        // and is responsible for passing on the MDC properties
        executor.setTaskDecorator(new MDCTaskDecorator());
        executor.initialize();
        return executor;
    }
}

@Slf4j
public class MDCTaskDecorator implements TaskDecorator {

    @Override
    public Runnable decorate(Runnable runnable) {
        Map<String, String> contextMap = MDC.getCopyOfContextMap();
        return () -> {
            try {
                if (contextMap != null) { // may be null if the caller had no MDC entries
                    MDC.setContextMap(contextMap);
                }
                runnable.run();
            } finally {
                MDC.clear();
            }
        };
    }
}
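To see the decorator in action, a minimal sketch of a caller (the service, method, and MDC key are made up) could look like this; any entries the web thread put into the MDC, such as a request id, show up in the async log output:

@Service
public class AuditService {

    private static final Logger log = LoggerFactory.getLogger(AuditService.class);

    @Async
    public void audit(String action) {
        // MDC entries copied over by the task decorator (e.g. "requestId"
        // put by the web thread) are visible on this worker thread.
        log.info("Audited action {} for request {}", action, MDC.get("requestId"));
    }
}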
All good now. Happy coding!
I have some solutions, roughly divided into Callable (for @Async), AsyncExecutionInterceptor (for @Async), and CallableProcessingInterceptor (for controllers).
1. The Callable solution for putting context info onto the @Async thread:
The key is using a ContextAwarePoolExecutor to replace the default executor for @Async:
@Configuration
public class DemoExecutorConfig {

    @Bean("demoExecutor")
    public Executor contextAwarePoolExecutor() {
        return new ContextAwarePoolExecutor();
    }
}
And the ContextAwarePoolExecutor, which overrides the submit and submitListenable methods to wrap each task in a ContextAwareCallable:
public class ContextAwarePoolExecutor extends ThreadPoolTaskExecutor {

    private static final long serialVersionUID = 667815067287186086L;

    @Override
    public <T> Future<T> submit(Callable<T> task) {
        return super.submit(new ContextAwareCallable<T>(task, newThreadContextContainer()));
    }

    @Override
    public <T> ListenableFuture<T> submitListenable(Callable<T> task) {
        return super.submitListenable(new ContextAwareCallable<T>(task, newThreadContextContainer()));
    }

    /**
     * set the info we need
     */
    private ThreadContextContainer newThreadContextContainer() {
        ThreadContextContainer container = new ThreadContextContainer();
        container.setRequestAttributes(RequestContextHolder.currentRequestAttributes());
        container.setContextMapOfMDC(MDC.getCopyOfContextMap());
        return container;
    }
}
The ThreadContextContainer is just a POJO that stores the info for convenience:
public class ThreadContextContainer implements Serializable {

    private static final long serialVersionUID = -6809291915300091330L;

    private RequestAttributes requestAttributes;

    private Map<String, String> contextMapOfMDC;

    public RequestAttributes getRequestAttributes() {
        return requestAttributes;
    }

    public Map<String, String> getContextMapOfMDC() {
        return contextMapOfMDC;
    }

    public void setRequestAttributes(RequestAttributes requestAttributes) {
        this.requestAttributes = requestAttributes;
    }

    public void setContextMapOfMDC(Map<String, String> contextMapOfMDC) {
        this.contextMapOfMDC = contextMapOfMDC;
    }
}
The ContextAwareCallable (a Callable proxy for the original task) overrides call() to restore the MDC and other context info before the original task executes:
public class ContextAwareCallable<T> implements Callable<T> {

    /**
     * the original task
     */
    private Callable<T> task;

    /**
     * stores the info we need
     */
    private ThreadContextContainer threadContextContainer;

    public ContextAwareCallable(Callable<T> task, ThreadContextContainer threadContextContainer) {
        this.task = task;
        this.threadContextContainer = threadContextContainer;
    }

    @Override
    public T call() throws Exception {
        // restore the context info
        if (threadContextContainer != null) {
            RequestAttributes requestAttributes = threadContextContainer.getRequestAttributes();
            if (requestAttributes != null) {
                RequestContextHolder.setRequestAttributes(requestAttributes);
            }
            Map<String, String> contextMapOfMDC = threadContextContainer.getContextMapOfMDC();
            if (contextMapOfMDC != null) {
                MDC.setContextMap(contextMapOfMDC);
            }
        }
        try {
            // execute the original task
            return task.call();
        } finally {
            // clear the context info after the task completes
            RequestContextHolder.resetRequestAttributes();
            MDC.clear();
        }
    }
}
In the end, use @Async with the configured bean "demoExecutor" like this:
@Async("demoExecutor")
void yourTaskMethod();
2. In regard to your question of handling the response:
I regret to tell you that I don't have a verified solution. Maybe org.springframework.aop.interceptor.AsyncExecutionInterceptor#invoke could solve that.
And I do not think it can be handled in your ServletLoggingFilter, because the @Async method returns instantly; the afterRequest method executes immediately, before the @Async method has done its work. You won't get what you want unless you synchronously wait for the @Async method to finish executing.
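If you really do need the async thread's MDC entries on the response path, one option along those lines (a sketch of the synchronous-wait idea, not a drop-in solution; the method name is made up) is to return the context explicitly and merge it back on the web thread:

@Async
public CompletableFuture<Map<String, String>> doSomethingAsync() {
    MDC.put("doSomething", "started");
    // ... do the actual work ...
    // Hand the async thread's MDC back to the caller explicitly.
    return CompletableFuture.completedFuture(MDC.getCopyOfContextMap());
}

// On the web thread: block until the async work finishes, then merge.
Map<String, String> asyncContext = doSomethingAsync().join();
asyncContext.forEach(MDC::put); // "doSomething" is now visible to afterRequest()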
But if you just want to log something, you can add the following to my example ContextAwareCallable, after the original task executes its call method:
try {
    // execute the original task
    return task.call();
} finally {
    String something = MDC.get("doSomething"); // will not be null
    // logthis(something);
    // clear the context info after the task completes
    RequestContextHolder.resetRequestAttributes();
    MDC.clear();
}

Spring Boot Undertow: how to dispatch to a worker thread

I'm currently having a look at Spring Boot Undertow and it's not really clear (to me) how to dispatch an incoming HTTP request to a worker thread for blocking operation handling.
Looking at the class UndertowEmbeddedServletContainer.class, it looks like there is no way to get this behaviour, since the only HttpHandler is a ServletHandler, which allows @Controller configurations:
private Undertow createUndertowServer() {
    try {
        HttpHandler servletHandler = this.manager.start();
        this.builder.setHandler(getContextHandler(servletHandler));
        return this.builder.build();
    }
    catch (ServletException ex) {
        throw new EmbeddedServletContainerException(
                "Unable to start embedded Undertow", ex);
    }
}

private HttpHandler getContextHandler(HttpHandler servletHandler) {
    if (StringUtils.isEmpty(this.contextPath)) {
        return servletHandler;
    }
    return Handlers.path().addPrefixPath(this.contextPath, servletHandler);
}
By default, in Undertow all requests are handled by an IO thread for non-blocking operations.
Does this mean that every @Controller execution will be processed by a non-blocking thread? Or is there a way to choose between the IO thread and a worker thread?
I tried to write a workaround, but this code is pretty ugly, and maybe someone has a better solution:
BlockingHandler.class
@Target({ElementType.TYPE})
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface BlockingHandler {

    String contextPath() default "/";
}
UndertowInitializer.class
public class UndertowInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {

    @Override
    public void initialize(ConfigurableApplicationContext configurableApplicationContext) {
        configurableApplicationContext.addBeanFactoryPostProcessor(new UndertowHandlerPostProcessor());
    }
}
UndertowHandlerPostProcessor.class
public class UndertowHandlerPostProcessor implements BeanDefinitionRegistryPostProcessor {

    @Override
    public void postProcessBeanDefinitionRegistry(BeanDefinitionRegistry beanDefinitionRegistry) throws BeansException {
        ClassPathScanningCandidateComponentProvider scanner = new ClassPathScanningCandidateComponentProvider(false);
        scanner.addIncludeFilter(new AnnotationTypeFilter(BlockingHandler.class));
        for (BeanDefinition beanDefinition : scanner.findCandidateComponents("org.me.lah")) {
            try {
                Class clazz = Class.forName(beanDefinition.getBeanClassName());
                beanDefinitionRegistry.registerBeanDefinition(clazz.getSimpleName(), beanDefinition);
            } catch (ClassNotFoundException e) {
                throw new BeanCreationException(format("Unable to create bean %s", beanDefinition.getBeanClassName()), e);
            }
        }
    }

    @Override
    public void postProcessBeanFactory(ConfigurableListableBeanFactory configurableListableBeanFactory) throws BeansException {
        // no need to post process defined beans
    }
}
override UndertowEmbeddedServletContainerFactory.class
public class UndertowEmbeddedServletContainerFactory extends AbstractEmbeddedServletContainerFactory implements ResourceLoaderAware, ApplicationContextAware {

    private ApplicationContext applicationContext;

    @Override
    public EmbeddedServletContainer getEmbeddedServletContainer(ServletContextInitializer... initializers) {
        DeploymentManager manager = createDeploymentManager(initializers);
        int port = getPort();
        if (port == 0) {
            port = SocketUtils.findAvailableTcpPort(40000);
        }
        Undertow.Builder builder = createBuilder(port);
        Map<String, Object> handlers = applicationContext.getBeansWithAnnotation(BlockingHandler.class);
        return new UndertowEmbeddedServletContainer(builder, manager, getContextPath(),
                port, port >= 0, handlers);
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
        this.applicationContext = applicationContext;
    }
}
...
override UndertowEmbeddedServletContainer.class
public UndertowEmbeddedServletContainer(Builder builder, DeploymentManager manager,
        String contextPath, int port, boolean autoStart, Map<String, Object> handlers) {
    this.builder = builder;
    this.manager = manager;
    this.contextPath = contextPath;
    this.port = port;
    this.autoStart = autoStart;
    this.handlers = handlers;
}

private Undertow createUndertowServer() {
    try {
        HttpHandler servletHandler = this.manager.start();
        String path = this.contextPath.isEmpty() ? "/" : this.contextPath;
        PathHandler pathHandler = Handlers.path().addPrefixPath(path, servletHandler);
        for (Entry<String, Object> entry : handlers.entrySet()) {
            Annotation annotation = entry.getValue().getClass().getDeclaredAnnotation(BlockingHandler.class);
            System.out.println(((BlockingHandler) annotation).contextPath());
            pathHandler.addPrefixPath(((BlockingHandler) annotation).contextPath(), (HttpHandler) entry.getValue());
        }
        this.builder.setHandler(pathHandler);
        return this.builder.build();
    }
    catch (ServletException ex) {
        throw new EmbeddedServletContainerException(
                "Unable to start embedded Undertow", ex);
    }
}
Set the initializer on the application context:
public static void main(String[] args) {
    new SpringApplicationBuilder(Application.class).initializers(new UndertowInitializer()).run(args);
}
Finally, create an HttpHandler that dispatches to a worker thread:
@BlockingHandler(contextPath = "/blocking/test")
public class DatabaseHandler implements HttpHandler {

    @Autowired
    private EchoService echoService;

    @Override
    public void handleRequest(HttpServerExchange httpServerExchange) throws Exception {
        if (httpServerExchange.isInIoThread()) {
            // re-dispatch this handler to a worker thread and stop
            // executing on the IO thread
            httpServerExchange.dispatch(this);
            return;
        }
        echoService.getMessage("my message");
    }
}
As you can see, my "solution" is really heavy, and I would really appreciate any help to simplify it.
Thank you.
You don't need to do anything.
Spring Boot's default Undertow configuration uses Undertow's ServletInitialHandler in front of Spring MVC's DispatcherServlet. This handler performs the exchange.isInIoThread() check and calls dispatch() if necessary.
If you place a breakpoint in your @Controller, you'll see that it's called on a thread named XNIO-1 task-n, which is a worker thread (the IO threads are named XNIO-1 I/O-n).
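A quick way to verify this yourself (a minimal sketch; the endpoint is made up) is to return the current thread name from a controller:

@RestController
public class ThreadCheckController {

    @GetMapping("/thread-check")
    public String threadCheck() {
        // On Undertow this reports something like "XNIO-1 task-3" (a worker
        // thread), not "XNIO-1 I/O-2" (an IO thread).
        return "Handled on: " + Thread.currentThread().getName();
    }
}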
