Combining blocking and non-blocking retries in Spring Kafka

I am trying to implement non-blocking retries with a single-topic fixed back-off.
I am able to do so thanks to the documentation: https://docs.spring.io/spring-kafka/reference/html/#single-topic-fixed-delay-retries.
Now I also need to perform a few blocking/local retries on the main topic. I have been trying to implement this using a DefaultErrorHandler, as below:
@Bean
public DefaultErrorHandler retryErrorHandler() {
    return new DefaultErrorHandler(new FixedBackOff(2000, 3));
}
This does not seem to work with @RetryableTopic.
I have also tried the retry-topic-combine-blocking approach (https://docs.spring.io/spring-kafka/reference/html/#retry-topic-combine-blocking) using ListenerContainerFactoryConfigurer,
but the issue I am facing there is creating the beans it requires, KafkaConsumerBackoffManager and DeadLetterPublishingRecovererFactory, and especially KafkaConsumerBackoffManager.
I need to know whether there is another way to achieve this using the Spring Kafka framework, or whether there is a way to construct the above beans.

We're currently working on improving configuration of the non-blocking retries components.
For now, as documented here, you should declare the bean yourself, injecting the framework's components:
@Bean(name = RetryTopicInternalBeanNames.LISTENER_CONTAINER_FACTORY_CONFIGURER_NAME)
public ListenerContainerFactoryConfigurer lcfc(KafkaConsumerBackoffManager kafkaConsumerBackoffManager,
        DeadLetterPublishingRecovererFactory deadLetterPublishingRecovererFactory,
        @Qualifier(RetryTopicInternalBeanNames.INTERNAL_BACKOFF_CLOCK_BEAN_NAME) Clock clock) {
    ListenerContainerFactoryConfigurer lcfc = new ListenerContainerFactoryConfigurer(
            kafkaConsumerBackoffManager, deadLetterPublishingRecovererFactory, clock);
    lcfc.setBlockingRetryableExceptions(MyBlockingRetryException.class, MyOtherBlockingRetryException.class);
    lcfc.setBlockingRetriesBackOff(new FixedBackOff(500, 5)); // Optional
    return lcfc;
}
Also, there's a known issue: if you try to declare these beans before the first @KafkaListener bean with a retryable topic has been processed, the feature's components won't be present in the application context yet, and an error will be thrown.
Does that happen to you?
We're currently working on a fix for this, but we should be able to work around that if that's your problem.
EDIT: Since the problem is that the components are not instantiated yet, the most reliable workaround is to provide the components yourself.
Here's a sample of how to do that. Of course, adjust it accordingly if you need any further customization.
@Configuration
public static class SO71705876Configuration {

    @Bean(name = RetryTopicInternalBeanNames.LISTENER_CONTAINER_FACTORY_CONFIGURER_NAME)
    public ListenerContainerFactoryConfigurer lcfc(KafkaConsumerBackoffManager kafkaConsumerBackoffManager,
            DeadLetterPublishingRecovererFactory deadLetterPublishingRecovererFactory) {
        ListenerContainerFactoryConfigurer lcfc = new ListenerContainerFactoryConfigurer(
                kafkaConsumerBackoffManager, deadLetterPublishingRecovererFactory, Clock.systemUTC());
        lcfc.setBlockingRetryableExceptions(IllegalArgumentException.class, IllegalStateException.class);
        lcfc.setBlockingRetriesBackOff(new FixedBackOff(500, 5)); // Optional
        return lcfc;
    }

    @Bean(name = RetryTopicInternalBeanNames.KAFKA_CONSUMER_BACKOFF_MANAGER)
    public KafkaConsumerBackoffManager backOffManager(ApplicationContext context) {
        PartitionPausingBackOffManagerFactory managerFactory =
                new PartitionPausingBackOffManagerFactory();
        managerFactory.setApplicationContext(context);
        return managerFactory.create();
    }

    @Bean(name = RetryTopicInternalBeanNames.DEAD_LETTER_PUBLISHING_RECOVERER_FACTORY_BEAN_NAME)
    public DeadLetterPublishingRecovererFactory dlprFactory(DestinationTopicResolver resolver) {
        return new DeadLetterPublishingRecovererFactory(resolver);
    }

    @Bean(name = RetryTopicInternalBeanNames.DESTINATION_TOPIC_CONTAINER_NAME)
    public DestinationTopicResolver destinationTopicResolver(ApplicationContext context) {
        return new DefaultDestinationTopicResolver(Clock.systemUTC(), context);
    }
}
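For completeness, a listener that combines both behaviors with this configuration might look like the following sketch, modeled on the single-topic fixed-delay example from the documentation (the topic name and payload type here are illustrative, not from the original question):

@RetryableTopic(backoff = @Backoff(2000), fixedDelayTopicStrategy = FixedDelayStrategy.SINGLE_TOPIC)
@KafkaListener(topics = "my-annotated-topic")
public void processMessage(MyPojo message) {
    // IllegalArgumentException / IllegalStateException (configured above) are retried
    // with the blocking FixedBackOff(500, 5); any other exception goes straight
    // to the non-blocking retry topic.
    process(message);
}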
In the next release this should not be a problem anymore. Please let me know if that works for you, or if any further adjustment to this workaround is necessary.
Thanks.

Unity to DryIoC conversion ParameterOverride

We are transitioning from Xamarin.Forms to .NET MAUI, but our project uses Prism.Unity.Forms. We have a lot of code that calls IContainer.Resolve<T>(), passing in a collection of ParameterOverrides; some are primitives, but some are interfaces/objects. The T we are resolving is usually a registered View, which may or may not be the correct way of doing this, but it's what I'm working with, and we are doing it in backend code (sometimes a service). What is the correct way of doing this Unity thing in DryIoC? Note these parameters are being set at runtime and may only be part of the parameters a constructor takes (some may come from already-registered dependencies).
Example of the scenario:
//Called from service into custom resolver method
var parameterOverrides = new[]
{
    new ParameterOverride("productID", 8675309),
    new ParameterOverride("objectWithData", objectWithData) // an IObjectWithData instance
};

//Custom resolver method example
var resolverOverrides = new List<ResolverOverride>();
foreach (var parameterOverride in parameterOverrides)
{
    resolverOverrides.Add(parameterOverride);
}
return _container.Resolve<T>(resolverOverrides.ToArray());
You've found out why you shouldn't use the container outside of the resolution root. I recommend not trying to replicate this mistake with another container, but rather fixing it: use hand-coded factories:
internal class SomeFactory : IProductViewFactory
{
    public SomeFactory( IService dependency )
    {
        _dependency = dependency ?? throw new ArgumentNullException( nameof(dependency) );
    }

    #region IProductViewFactory
    public IProductView Create( int productID, IObjectWithData objectWithData )
        => new SomeProduct( productID, objectWithData, _dependency );
    #endregion

    #region private
    private readonly IService _dependency;
    #endregion
}
See this, too:
For dependencies that are independent of the instance you're creating, inject them into the factory and store them until needed.
For dependencies that are independent of the context of creation but need to be recreated for each created instance, inject factories into the factory and store them.
For dependencies that are dependent on the context of creation, pass them into the Create method of the factory.
Also, be aware of potential subtle differences in container behaviours: Unity's ResolverOverride applies to the whole Resolve call, i.e. the overrides apply to parameters of dependencies too, whatever happens to match by name. This could well be handled very differently by DryIoc.
First, I would agree with the @haukinger answer: rethink how you pass the runtime information into the services. The most transparent and simple way, in my opinion, is to pass it via parameters into the consuming methods.
Second, here is a complete example in DryIoc to solve it head-on, plus the live code to play with.
using System;
using DryIoc;

public class Program
{
    record ParameterOverride(string Name, object Value);
    record Product(int productID);

    public static void Main()
    {
        // get the container somehow;
        // if you don't have access to it directly, you may resolve it from your service provider
        IContainer c = new Container();

        c.Register<Product>();

        var parameterOverrides = new[]
        {
            new ParameterOverride("productID", 8675309),
            new ParameterOverride("objectWithData", "blah"),
        };

        var parameterRules = Parameters.Of;
        foreach (var po in parameterOverrides)
        {
            parameterRules = parameterRules.Details((_, x) => x.Name.Equals(po.Name) ? ServiceDetails.Of(po.Value) : null);
        }

        c = c.With(rules => rules.With(parameters: parameterRules));

        var s = c.Resolve<Product>();
        Console.WriteLine(s.productID);
    }
}

Kafka Streams Serdes with nested generics don't work

I have the following code, which uses the functional style to define two functions for Kafka topics:
@Bean
public Function<KStream<String, CloudEvent<ClassA>>, KStream<String, CloudEvent<ClassB>>> method1() {
    ....... //lambda
}

@Bean
public Function<KStream<String, CloudEvent<ClassB>>, KStream<String, CloudEvent<ClassC>>> method2() {
    ...... //lambda
}
For these two functions I define Serdes like so:
@Bean
public Serde<CloudEvent<ClassA>> classASerde(ObjectMapper mapper, Validator validator) {
    return StreamsSerdes.classASerde(mapper, validator);
}

@Bean
public Serde<CloudEvent<ClassB>> classBSerde(ObjectMapper mapper, Validator validator) {
    return StreamsSerdes.classBSerde(mapper, validator);
}
This construction doesn't work: at runtime Spring tries to deserialize CloudEvent<ClassB> with the Serde for CloudEvent<ClassA>. Is there some way to hint at the correct Serde for method1 and method2?
Secondly, I could bypass the above issue by specifying the Serdes in application.properties:
spring.cloud.stream.kafka.streams.bindings.method1-in-0.consumer.valueSerde=package.serde.StreamsSerdes$ClassASerde
spring.cloud.stream.kafka.streams.bindings.method2-in-0.consumer.valueSerde=package.serde.StreamsSerdes$ClassBSerde
However, now I get other issues, as these Serde classes don't have a default constructor. I need the ObjectMapper and Validator beans (@Service) injected by Spring to perform conversions/validations during deserialization.
Has anyone come across similar issues, or does anyone have ideas on how to resolve them?
Thanks
I think it is a gap that nested generics are not working right now in the binder. Do you mind creating an issue in the repository and linking this thread?
As for the second issue you are running into when providing the properties in application.properties, you can try a workaround. The Serde interface has a configure method that takes a map:
default void configure(Map<String, ?> configs, boolean isKey) {
    // intentionally left blank
}
Override this method in your Serde implementation and retrieve those bean objects from the map under some keys:
private ObjectMapper mapper;
private Validator validator;

@Override
public void configure(Map<String, ?> configs, boolean isKey) {
    this.mapper = (ObjectMapper) configs.get("mapper.key");
    this.validator = (Validator) configs.get("validator.key");
}
You need to stop receiving them through the constructor and use those fields directly for serialization and deserialization; a fuller sketch follows below.
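For illustration, the whole Serde might then look like this (an untested sketch; ClassASerializer and ClassADeserializer are stand-ins for your own mapper- and validator-aware implementations):

public class ClassASerde implements Serde<CloudEvent<ClassA>> {

    private ObjectMapper mapper;
    private Validator validator;

    // default constructor, so the binder can instantiate it from the property value
    public ClassASerde() {
    }

    @Override
    public void configure(Map<String, ?> configs, boolean isKey) {
        this.mapper = (ObjectMapper) configs.get("mapper.key");
        this.validator = (Validator) configs.get("validator.key");
    }

    @Override
    public Serializer<CloudEvent<ClassA>> serializer() {
        return new ClassASerializer(mapper, validator);
    }

    @Override
    public Deserializer<CloudEvent<ClassA>> deserializer() {
        return new ClassADeserializer(mapper, validator);
    }
}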
Then you provide this bean in your application to populate the map:
@Bean
public StreamsBuilderFactoryBeanCustomizer streamsBuilderFactoryBeanCustomizer(ObjectMapper mapper, Validator validator) {
    return factoryBean -> {
        factoryBean.getStreamsConfiguration().put("mapper.key", mapper);
        factoryBean.getStreamsConfiguration().put("validator.key", validator);
    };
}
I haven't tried this code in an application, but it is something that you can try and see if it works with your code.

AOP aspects as mocks in Spring tests

I came across an interesting article: AOP Aspects as mocks in JUnit
Since I have a requirement to mock multiple final and private static variables, I am planning to use AOP in place of reflection or PowerMockito, as those are causing issues with SpringJUnit4ClassRunner.
Is there any way I can use @Aspect for test classes without using the annotation @EnableAspectJAutoProxy? (I want to use an aspect targeting class X only in one test case.)
This is a sample of what I want to do.
(The question has been answered; adding this for discussion on what else could be done.)
//External class
public final class ABC {
    public void method1() throws Exception {}
}

@Service
public class DestClass {

    private static final ABC abc = new ABC();

    public Object m() {
        // code (...)
        try {
            abc.method1();
        }
        catch (Exception e) {
            // do something (...)
            return null;
        }
        // more code (...)
        return new Object(); // placeholder result
    }
}
The Spring Framework allows you to programmatically create proxies that advise target objects, without configuring them through @EnableAspectJAutoProxy or <aop:aspectj-autoproxy>.
Details can be found in the documentation section Programmatic Creation of @AspectJ Proxies, and the implementation is pretty simple.
Example code from the documentation:
// create a factory that can generate a proxy for the given target object
AspectJProxyFactory factory = new AspectJProxyFactory(targetObject);

// add an aspect; the class must be an @AspectJ aspect
// (you can call this as many times as you need with different aspects)
factory.addAspect(SecurityManager.class);

// you can also add existing aspect instances; the type of the object supplied must be an @AspectJ aspect
factory.addAspect(usageTracker);

// now get the proxy object...
MyInterfaceType proxy = factory.getProxy();
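Applied to a test, this makes it possible to advise a collaborator in a single test case without @EnableAspectJAutoProxy. A rough, untested sketch with hypothetical names (note that CGLIB proxying will not work on a final class with no interface, such as ABC above):

@Aspect
public class StubAspect {
    @Around("execution(* com.example.Collaborator.fetch(..))")
    public Object stub(ProceedingJoinPoint pjp) {
        return "stubbed-value"; // skip the real call entirely
    }
}

// inside a single test method:
AspectJProxyFactory factory = new AspectJProxyFactory(new Collaborator());
factory.addAspect(StubAspect.class);
Collaborator proxy = factory.getProxy();
// hand 'proxy' to the class under test instead of the real Collaborator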
Please note that with Spring AOP, only method executions can be advised. Excerpt from the documentation:
Spring AOP currently supports only method execution join points
(advising the execution of methods on Spring beans). Field
interception is not implemented, although support for field
interception could be added without breaking the core Spring AOP APIs.
If you need to advise field access and update join points, consider a
language such as AspectJ.
The article shared with the question is about AspectJ, and without the sample code to be advised it is hard to conclude whether the requirement can be achieved through Spring AOP. The article mentions this as well:
One example of the integration of AspectJ is the Spring framework,
which now can use the AspectJ pointcut language in its own AOP
implementation. Spring’s implementation is not specifically targeted
as a test solution.
Hope this helps.
--- Update : A test case without using AOP ---
Consider the external class:

public class ABCImpl implements ABC {
    @Override
    public void method1(String example) {
        System.out.println("ABC method 1 called: " + example);
    }
}
And the DestClass
@Service
public class DestClass {

    private static final ABC service = new ABCImpl();

    protected ABC abc() throws Exception {
        System.out.println("DestClass.abc() called");
        return service;
    }

    public Object m() {
        Object obj = new Object();
        try {
            abc().method1("test");
        } catch (Exception e) {
            System.out.println("Exception : " + e.getMessage());
            return null;
        }
        return obj;
    }
}
The following test class autowires the DestClass bean with overridden logic that throws an exception. This code can be modified to fit your requirement.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { DestClassSpringTest.TestConfiguration.class })
public class DestClassSpringTest {

    @Configuration
    static class TestConfiguration {
        @Bean
        public DestClass destClass() {
            return new DestClass() {
                protected ABC abc() throws Exception {
                    // super.abc(); // not required; added to demo the parent method call
                    throw new Exception("Custom exception thrown");
                }
            };
        }
    }

    @Autowired
    DestClass cut;

    @Test
    public void test() {
        Object obj = cut.m();
        assertNull(obj);
    }
}
The following will be the output log:
DestClass.abc() called // this line appears only if the parent method call in DestClassSpringTest.TestConfiguration is uncommented
Exception : Custom exception thrown
The article you are referring to uses full AspectJ, not Spring AOP. Thus, you do not need any @EnableAspectJAutoProxy for that, just
either the AspectJ load-time weaver on the command line when running your test, via -javaagent:/path/to/aspectjweaver.jar,
or the AspectJ compiler activated when compiling your tests (easily done via the AspectJ Maven plugin if you use Maven).
Both approaches are completely independent of Spring and will work in any project. Even when using Spring, they also work when targeting execution of third-party code, because no dynamic proxies are needed, unlike in Spring AOP. So there is no need to make the target code into a Spring bean or to create a wrapper method in your application class for it. When using compile-time weaving, you can even avoid weaving into the third-party library by using a call() instead of an execution() pointcut, as in the sketch below. Spring AOP only knows execution(); AspectJ is more powerful.
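For illustration, a native AspectJ aspect stubbing out calls to ABC.method1 from DestClass might look like this (an untested sketch; it requires the weaver or compiler set up as described above):

@Aspect
public class ABCStubAspect {
    // call() intercepts the caller side, so the third-party ABC class itself is not woven
    @Around("call(* ABC.method1(..)) && within(DestClass)")
    public Object stubMethod1(ProceedingJoinPoint pjp) throws Throwable {
        throw new Exception("stubbed failure"); // simulate the error case
    }
}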
By the way: unfortunately, both your question and your comment about the solution you found are somewhat fuzzy, and I do not fully understand your requirement. For example, you talked about mocking final and private static variables, which would also be possible in other ways with AspectJ, using set() and/or get() pointcuts. But actually it seems you do not need to mock the field contents, just stub the results of method calls on the objects assigned to those fields.

How to use ResourceProcessorHandlerMethodReturnValueHandler in a spring-hateoas project

When using spring-data-rest, there is post-processing of Resource classes returned from controllers (e.g. RepositoryRestControllers). The proper ResourceProcessor is called during this post-processing.
The class responsible for this is ResourceProcessorHandlerMethodReturnValueHandler, which is part of spring-hateoas.
I now have a project that only uses spring-hateoas, and I wonder how to configure ResourceProcessorHandlerMethodReturnValueHandler in such a scenario. It looks like the auto-configuration part of it still resides in spring-data-rest.
Any hints on how to enable ResourceProcessorHandlerMethodReturnValueHandler in a spring-hateoas context?
I've been looking at this recently too, and documentation on how to achieve this is non-existent. If you create a bean of type ResourceProcessorInvokingHandlerAdapter, you seem to lose the auto-configured RequestMappingHandlerAdapter and all its features. As such, I wanted to avoid using this bean or losing the WebMvcAutoConfiguration, since all I really wanted was the ResourceProcessorHandlerMethodReturnValueHandler.
You can't just add a ResourceProcessorHandlerMethodReturnValueHandler via WebMvcConfigurer.addReturnValueHandlers, because what we actually need to do is override the entire list, as happens in ResourceProcessorInvokingHandlerAdapter.afterPropertiesSet:
@Override
public void afterPropertiesSet() {
    super.afterPropertiesSet();

    // Retrieve actual handlers to use as delegate
    HandlerMethodReturnValueHandlerComposite oldHandlers = getReturnValueHandlersComposite();

    // Set up ResourceProcessingHandlerMethodResolver to delegate to originally configured ones
    List<HandlerMethodReturnValueHandler> newHandlers = new ArrayList<HandlerMethodReturnValueHandler>();
    newHandlers.add(new ResourceProcessorHandlerMethodReturnValueHandler(oldHandlers, invoker));

    // Configure the new handler to be used
    this.setReturnValueHandlers(newHandlers);
}
So, without a better solution available, I added a BeanPostProcessor to handle setting the List of handlers on an existing RequestMappingHandlerAdapter:
@Component
@RequiredArgsConstructor
@ConditionalOnBean(ResourceProcessor.class)
public class ResourceProcessorHandlerMethodReturnValueHandlerConfigurer implements BeanPostProcessor {

    private final Collection<ResourceProcessor<?>> resourceProcessors;

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName)
            throws BeansException {
        if (bean instanceof RequestMappingHandlerAdapter) {
            RequestMappingHandlerAdapter requestMappingHandlerAdapter = (RequestMappingHandlerAdapter) bean;
            List<HandlerMethodReturnValueHandler> handlers =
                    requestMappingHandlerAdapter.getReturnValueHandlers();
            HandlerMethodReturnValueHandlerComposite delegate =
                    handlers instanceof HandlerMethodReturnValueHandlerComposite ?
                            (HandlerMethodReturnValueHandlerComposite) handlers :
                            new HandlerMethodReturnValueHandlerComposite().addHandlers(handlers);
            requestMappingHandlerAdapter.setReturnValueHandlers(Arrays.asList(
                    new ResourceProcessorHandlerMethodReturnValueHandler(delegate,
                            new ResourceProcessorInvoker(resourceProcessors))));
            return requestMappingHandlerAdapter;
        }
        else return bean;
    }
}
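With that in place, any ResourceProcessor beans are picked up by the handler. For illustration, a processor that decorates a resource with an extra link might look like this (Foo is a hypothetical domain type):

@Bean
public ResourceProcessor<Resource<Foo>> fooProcessor() {
    return resource -> {
        // add a custom link to every Resource<Foo> returned by a controller
        resource.add(new Link("/foos/search", "search"));
        return resource;
    };
}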
This has seemed to work so far...

Spring MVC + Session Conversation (multiple tabs) + Spring Security with CSRF + Thymeleaf

In previous versions of Spring + Spring Security, when I didn't use the built-in CSRF support, it was easy to add session-conversation support (for supporting multiple "edit" tabs) using the techniques described at https://github.com/duckranger/Spring-MVC-conversation, specifically by implementing a RequestDataValueProcessor.
However, now that we use Spring Boot and all its auto-configuration goodness, CsrfRequestDataValueProcessor already implements RequestDataValueProcessor,
and if I add my own RequestDataValueProcessor implementation, it never gets used.
Can anyone point me in the right direction so that I can use both CsrfRequestDataValueProcessor and my RequestDataValueProcessor?
I assume I would need to create a composite RequestDataValueProcessor, or can you have multiple RequestDataValueProcessor implementations?
Thanks in advance
I just solved this very problem with a small modification to the duckranger solution: I wedged a call to CsrfRequestDataValueProcessor into the getExtraHiddenFields method.
@Override
public Map<String, String> getExtraHiddenFields(HttpServletRequest request) {
    CsrfRequestDataValueProcessor csrfRDVP = new CsrfRequestDataValueProcessor();
    Map<String, String> hiddenFields = csrfRDVP.getExtraHiddenFields(request);
    if (request.getAttribute(ConversationalSessionAttributeStore.CID_FIELD) != null) {
        hiddenFields.put(ConversationalSessionAttributeStore.CID_FIELD,
                request.getAttribute(ConversationalSessionAttributeStore.CID_FIELD).toString());
    }
    return hiddenFields;
}
I just implemented this today and have barely started user testing, so USE AT YOUR OWN RISK.
I can't help but think there's a better way, but this seems to do the trick.
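For what it's worth, a more general version of that idea would be a delegating composite, roughly along these lines (an untested sketch):

public class CompositeRequestDataValueProcessor implements RequestDataValueProcessor {

    private final List<RequestDataValueProcessor> delegates;

    public CompositeRequestDataValueProcessor(RequestDataValueProcessor... delegates) {
        this.delegates = Arrays.asList(delegates);
    }

    @Override
    public String processAction(HttpServletRequest request, String action, String httpMethod) {
        for (RequestDataValueProcessor delegate : delegates) {
            action = delegate.processAction(request, action, httpMethod);
        }
        return action;
    }

    @Override
    public String processFormFieldValue(HttpServletRequest request, String name, String value, String type) {
        for (RequestDataValueProcessor delegate : delegates) {
            value = delegate.processFormFieldValue(request, name, value, type);
        }
        return value;
    }

    @Override
    public Map<String, String> getExtraHiddenFields(HttpServletRequest request) {
        Map<String, String> fields = new LinkedHashMap<>();
        for (RequestDataValueProcessor delegate : delegates) {
            fields.putAll(delegate.getExtraHiddenFields(request));
        }
        return fields;
    }

    @Override
    public String processUrl(HttpServletRequest request, String url) {
        for (RequestDataValueProcessor delegate : delegates) {
            url = delegate.processUrl(request, url);
        }
        return url;
    }
}

As far as I can tell, Spring MVC resolves this as a single bean named requestDataValueProcessor, which Spring Security's auto-configuration already claims, so registering the composite under that name is the tricky part.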
Thought I would share what I did to get this working.
I needed a solution, and I just couldn't get my own RequestDataValueProcessor to be used by Thymeleaf, even though the bean was created correctly in my configuration.
Note: I am aware that this is quite a hack, but it is pretty interesting at the same time.
The Solution: AspectJ.....
@Aspect
public class SessionConversationAspect {

    private final ConversationIdRequestProcessor conversationIdRequestProcessor;

    public SessionConversationAspect() {
        this.conversationIdRequestProcessor = new ConversationIdRequestProcessor();
    }

    @Pointcut("execution(* org.springframework.security.web.servlet.support.csrf.CsrfRequestDataValueProcessor.getExtraHiddenFields(..)) && args(request)")
    protected void getExtraHiddenFields(HttpServletRequest request) {
    }

    @AfterReturning(
            pointcut = "getExtraHiddenFields(request)",
            returning = "hiddenFields"
    )
    protected void addExtraHiddenFields(HttpServletRequest request, Map<String, String> hiddenFields) {
        Map<String, String> extraFields = conversationIdRequestProcessor.getExtraHiddenFields(request);
        hiddenFields.putAll(extraFields);
    }
}
