EmbeddedKafka w/ ContainerTestUtils.waitForAssignment throws: Expected 1 but got 0 partitions - spring-kafka

We have an integration test where we use EmbeddedKafka: we produce a message to a topic, our app processes that message, and the result is sent to a second topic where we consume and assert the output. In CI this works maybe 2/3 of the time, but we hit cases where KafkaTestUtils.getSingleRecord throws java.lang.IllegalStateException: No records found for topic (see [1] below).
To try to resolve this, I added ContainerTestUtils.waitForAssignment for each listener container in the registry (see [2] below). After a few successful runs in CI, I saw a new exception: java.lang.IllegalStateException: Expected 1 but got 0 partitions. This has me wondering whether the missing assignment was actually the root cause of the original "No records found" exception.
Any ideas what could help with the random failures here? I would appreciate any suggestions on how to troubleshoot.
spring-kafka and spring-kafka-test v2.6.4.
Edit: Added newConsumer for reference.
Example of our setup:
@SpringBootTest
@RunWith(SpringRunner.class)
@DirtiesContext
@EmbeddedKafka(
        topics = { "topic1", "topic2" },
        partitions = 1,
        brokerProperties = { "listeners=PLAINTEXT://localhost:9099", "port=9099" })
public class IntegrationTest {

    @Autowired
    private EmbeddedKafkaBroker embeddedKafkaBroker;

    @Autowired
    private KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    @Test
    public void testExample() {
        try (Consumer<String, String> consumer = newConsumer()) {
            for (MessageListenerContainer messageListenerContainer : kafkaListenerEndpointRegistry.getListenerContainers()) {
                // [2]
                ContainerTestUtils.waitForAssignment(messageListenerContainer, embeddedKafkaBroker.getPartitionsPerTopic());
            }
            try (Producer<String, String> producer = newProducer()) {
                embeddedKafkaBroker.consumeFromAnEmbeddedTopic(consumer, "topic2");
                producer.send(new ProducerRecord<>("topic1", "test payload"));
                producer.flush();
            }
            String result = KafkaTestUtils.getSingleRecord(consumer, "topic2").value(); // [1]
            assertEquals(result, "expected result");
        }
    }

    private Consumer<String, String> newConsumer() {
        Map<String, Object> consumerProps = KafkaTestUtils.consumerProps("groupId", "false", embeddedKafkaBroker);
        ConsumerFactory<String, String> consumerFactory = new DefaultKafkaConsumerFactory<>(
                consumerProps,
                new StringDeserializer(),
                new CustomDeserializer<>());
        return consumerFactory.createConsumer();
    }
}
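One way to narrow down this kind of flakiness is to make the timeouts and pre-conditions explicit, so a slow rebalance shows up as latency rather than as a missing record. The following is only a rough sketch against the test above (the 10-second timeout is arbitrary, and isRunning() is merely a sanity check that the containers were actually started before waiting on them):
// Sketch only: same loop as [2], with an explicit check that each container was started.
for (MessageListenerContainer container : kafkaListenerEndpointRegistry.getListenerContainers()) {
    assertTrue(container.isRunning());
    ContainerTestUtils.waitForAssignment(container, embeddedKafkaBroker.getPartitionsPerTopic());
}
// ... produce as above ...
// Give the record an explicit 10s to arrive instead of relying on the default timeout.
String result = KafkaTestUtils.getSingleRecord(consumer, "topic2", 10_000L).value();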

Related

Spring Kafka: Skip error message using CommonErrorHandler

I am using spring-kafka 2.8.9 and kafka-clients 2.8.1. I want to skip a message that fails to deserialize. Since setErrorHandler is deprecated, I tried using CommonErrorHandler, but I am not sure how to skip the current error message and move on to the next record. The only option I can see is pattern matching: extracting the relevant details, such as offset and partition, from a line like the one below.
org.apache.kafka.common.errors.SerializationException: Error deserializing key/value for partition test-0 at offset 1. If needed, please seek past the record
Is there any other way, such as a RecordDeserializationException, to get the necessary information from the exception, or any other means that avoids pattern matching? I cannot upgrade to kafka 3.X.X.
My config
@Bean
public ConsumerFactory<String, Farewell> farewellConsumerFactory() {
    groupId = LocalTime.now().toString();
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
    props.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), new JsonDeserializer<>(Farewell.class));
}

@Bean
public ConcurrentKafkaListenerContainerFactory<String, Farewell> farewellKafkaListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, Farewell> factory = new ConcurrentKafkaListenerContainerFactory<>();
    factory.setCommonErrorHandler(new CommonErrorHandler() {
        @Override
        public void handleOtherException(Exception thrownException, Consumer<?, ?> consumer, MessageListenerContainer container, boolean batchListener) {
            CommonErrorHandler.super.handleOtherException(thrownException, consumer, container, batchListener);
        }
    });
    factory.setConsumerFactory(farewellConsumerFactory());
    return factory;
}
My listener class
@KafkaListener(topics = "${topicId}",
        containerFactory = "farewellKafkaListenerContainerFactory")
public void farewellListener(Farewell message) {
    System.out.println("Received Message in group " + groupId + "| " + message);
}
Domain class
public class Farewell {

    private String message;
    private Integer remainingMinutes;

    public Farewell(String message, Integer remainingMinutes) {
        this.message = message;
        this.remainingMinutes = remainingMinutes;
    }
    // standard getters, setters and constructor
}
I have checked these links
How to skip a msg that have error in kafka when i use ConcurrentMessageListenerContainer?
Better way of error handling in Kafka Consumer
Use an ErrorHandlingDeserializer as a wrapper around your real deserializer.
Serialization exceptions will be sent directly to the DefaultErrorHandler, which treats such exceptions as fatal (by default) and sends them directly to the recoverer.
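A minimal sketch of that suggestion, adapted to the consumer factory from the question (ErrorHandlingDeserializer lives in org.springframework.kafka.support.serializer; the bean and field names are the ones used above):
@Bean
public ConsumerFactory<String, Farewell> farewellConsumerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapAddress);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
    props.put(JsonDeserializer.TRUSTED_PACKAGES, "*");
    // Wrap the real value deserializer; a record that cannot be deserialized is handed
    // to the container's error handler instead of failing the poll loop repeatedly.
    ErrorHandlingDeserializer<Farewell> valueDeserializer =
            new ErrorHandlingDeserializer<>(new JsonDeserializer<>(Farewell.class));
    return new DefaultKafkaConsumerFactory<>(props, new StringDeserializer(), valueDeserializer);
}
With spring-kafka 2.8 the container already uses DefaultErrorHandler by default, so the custom CommonErrorHandler in the factory above should not be needed for this case: the deserialization failure is treated as fatal (not retried), handed to the recoverer (which logs by default), and the consumer moves on to the next record.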

Spring Cloud Contract generated tests fail on empty responses from producer

I am trying to introduce Spring Cloud Contract into my project. I am following the instructions from this Baeldung article:
https://www.baeldung.com/spring-cloud-contract
Dependencies are added
Plugin is configured
Producer contract is defined
BaseTest is defined
Unfortunately my generated tests fail because the response (jsonBody) is "empty".
Here's a few pieces of the setup:
BaseContractTest =>
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.MOCK)
@AutoConfigureMessageVerifier
@DirtiesContext
@AutoConfigureStubRunner(ids = "com.example:producer-service:+:stubs:8080",
        stubsMode = StubRunnerProperties.StubsMode.LOCAL)
public class BaseContractTest {

    @Autowired
    private WebApplicationContext webApplicationContext;

    @BeforeEach
    void setUp() {
        final DefaultMockMvcBuilder defaultMockMvcBuilder =
                MockMvcBuilders.webAppContextSetup(webApplicationContext);
        defaultMockMvcBuilder.apply(
                springSecurity((request, response, chain) -> chain.doFilter(request, response)));
        RestAssuredMockMvc.mockMvc(defaultMockMvcBuilder.build());
    }
}
contract =>
Contract.make {
    description "GetCustomer should return a Customer"
    request {
        method GET()
        url value(consumer(regex('/producer-service/v1/customer/ID-\\d*-\\d*')), producer("/producer-service/customer/ID-132456-9876"))
    }
    response {
        status OK()
        body(
                id: "ID-132456-9876", name: "exampleName"
        )
        headers {
            contentType(applicationJson())
        }
    }
}
The WireMock mappings are properly generated (omitted for brevity).
Generated ContractTest =>
public class ContractVerifierTest extends BaseContractTest {

    @Test
    public void validate_shouldReturnACustomer() throws Exception {
        // given:
        MockMvcRequestSpecification request = given();
        // when:
        ResponseOptions response = given().spec(request)
                .get("/producer-service/v1/ID-132456-9876");
        // then:
        assertThat(response.statusCode()).isEqualTo(200);
        assertThat(response.header("Content-Type")).matches("application/json.*");
        // and:
        DocumentContext parsedJson = JsonPath.parse(response.getBody().asString());
        assertThatJson(parsedJson).field("['id']").isEqualTo("ID-132456-9876");
        assertThatJson(parsedJson).field("['name']").isEqualTo("exampleName");
    }
}
When the test runs, it fails with this error:
validate_shouldReturnACustomer Time elapsed: 0.731 s <<< FAILURE!
java.lang.AssertionError:
Expecting actual not to be null
at com.example.contracts.ContractVerifierTest.validate_shouldReturnACustomer(ContractVerifierTest.java:31)
When I look up the corresponding error line, it fails on =>
assertThat(response.header("Content-Type")).matches("application/json.*");
I am a bit clueless at this point.
I tried to use the MockStandaloneApp tied to the controller (as per the Baeldung link), but that did not help.
Note that the service returns a Mono<Customer>, not an actual Customer, if that changes anything.

How to get a ThreadLocal for a concurrent consumer?

I am developing a Spring Kafka consumer. Due to the message volume, I need concurrency to keep up the throughput. Because of that concurrency, I use a ThreadLocal object to store thread-scoped data, and now I need to remove that ThreadLocal value after using it.
The Spring documentation linked below suggests implementing an EventListener that listens for ConsumerStoppedEvent, but it does not show any sample listener code that gets the ThreadLocal and removes its value. Could you let me know how to get the ThreadLocal instance in this case?
Code samples would be appreciated.
https://docs.spring.io/spring-kafka/docs/current/reference/html/#thread-safety
Something like this:
@SpringBootApplication
public class So71884752Application {

    public static void main(String[] args) {
        SpringApplication.run(So71884752Application.class, args);
    }

    @Bean
    public NewTopic topic2() {
        return TopicBuilder.name("topic1").partitions(2).build();
    }

    @Component
    static class MyListener implements ApplicationListener<ConsumerStoppedEvent> {

        private static final ThreadLocal<Long> threadLocalState = new ThreadLocal<>();

        @KafkaListener(topics = "topic1", groupId = "my-consumer", concurrency = "2")
        public void listen() {
            long id = Thread.currentThread().getId();
            System.out.println("set thread id to ThreadLocal: " + id);
            threadLocalState.set(id);
        }

        @Override
        public void onApplicationEvent(ConsumerStoppedEvent event) {
            System.out.println("Remove from ThreadLocal: " + threadLocalState.get());
            threadLocalState.remove();
        }
    }
}
So, I have two concurrent listener containers for the two partitions in the topic. Each of them calls this @KafkaListener method, and I store the thread id in the ThreadLocal just to keep the use case simple and test the feature.
Then I implement ApplicationListener<ConsumerStoppedEvent>, which is emitted on the appropriate consumer thread. That lets me extract the ThreadLocal value and clean it up at the end of the consumer's life.
The test against embedded Kafka looks like this:
@SpringBootTest
@EmbeddedKafka(bootstrapServersProperty = "spring.kafka.bootstrap-servers")
@DirtiesContext
class So71884752ApplicationTests {

    @Autowired
    KafkaTemplate<String, String> kafkaTemplate;

    @Autowired
    KafkaListenerEndpointRegistry kafkaListenerEndpointRegistry;

    @Test
    void contextLoads() throws InterruptedException {
        this.kafkaTemplate.send("topic1", "1", "foo");
        this.kafkaTemplate.send("topic1", "2", "bar");
        this.kafkaTemplate.flush();
        Thread.sleep(1000); // Give it a chance to consume data
        this.kafkaListenerEndpointRegistry.stop();
    }
}
Right. It doesn't verify anything, but it demonstrates how that event can happen.
I see something like this in the log output:
set thread id to ThreadLocal: 125
set thread id to ThreadLocal: 127
...
Remove from ThreadLocal: 125
Remove from ThreadLocal: 127
So, what the doc says is correct.

Spring Boot MVC Test 404 with Valid Request

I am using Spring Boot 2.0.6 and have set up a test for a controller: the method is as follows:
#Secured("ROLE_ADMIN")
#GetMapping(value = {"/maintainers/aircrafts/workorders/workitems/{wid}/parts"}, produces = "application/json")
#ResponseStatus(value = HttpStatus.OK)
Response<Page<WorkItem>> getPagedParts(
#PathVariable("wid") Optional<Long> workItemId,
#PageableDefault(page = DEFAULT_PAGE_NUMBER, size = DEFAULT_PAGE_SIZE)
#SortDefault.SortDefaults({
#SortDefault(sort = "partName", direction = Sort.Direction.ASC),
#SortDefault(sort = "partSpecification", direction = Sort.Direction.ASC)
}) Pageable pageable) {
LOG.info("looking for work: {}", workItemId);
return Response.of(workItemService.findAllPartsForWorkItem(workItemId.get(), pageable));
}
As you can see, it is supposed to do paging and sorting, but it doesn't even get past the path:
The test that tests it is as follows:
#ActiveProfiles("embedded")
#RunWith(SpringRunner.class)
#SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
#EnableConfigurationProperties
#EnableJpaRepositories({ "au.com.avmaint.api" })
#AutoConfigureMockMvc
public class WorkItemControllerPartsFunctionalTest {
private static final Logger LOG = LoggerFactory.getLogger(WorkItemControllerFunctionalTest.class);
private String adminJwtToken;
#Autowired
private WebApplicationContext context;
#Autowired
private MockMvc mvc;
#Autowired
private UserService userService;
#Autowired
private RoleService roleService;
#Autowired
private CasaService casaService;
#Autowired
private MaintainerService maintainerService;
#Autowired
private MaintenanceContractService maintenanceContractService;
#Autowired
private WorkSetService workSetService;
#Autowired
private WorkSetTemplateService workSetTemplateService;
#Autowired
private AircraftService aircraftService;
Maintainer franks;
MaintenanceContract contract;
#Before
public void setup() {
mvc = MockMvcBuilders
.webAppContextSetup(context)
.apply(springSecurity())
.build();
franks = MaintainerFixtures.createFranksMaintainer(maintainerService, maintenanceContractService, casaService);
adminJwtToken = UserAndRoleFixtures.adminToken(userService, roleService, franks);
contract = WorkItemFixtures.makeDetailedJobOnContract(franks, maintainerService, maintenanceContractService, workSetTemplateService, casaService, aircraftService);
}
#Test
public void findingWorkItemsWithoutParts() throws Exception {
Set<WorkSet> sets = contract.getWorkOrders().stream().findFirst().get().getWorkSets();
WorkSet hundredHourly = sets.stream().filter(s -> s.getName().equals("100 Hourly for PA-31")).findFirst().orElse(null);
WorkItem opening = hundredHourly.getWorkItems().stream().filter(wi -> wi.getTitle().equals("Opening the aircraft")).findFirst().orElse(null);
LOG.info("opening item: {}", opening);
LOG.info("HUNDRED: {}", hundredHourly);
mvc.perform(get("/maintainers/aircrafts/workorders/workitems/" + opening.getId() + "/parts")
.header(AUTHORIZATION_HEADER, "Bearer " + adminJwtToken))
.andDo(print())
.andExpect(status().isOk())
.andExpect(jsonPath("$.payload").isNotEmpty())
.andExpect(jsonPath("$.payload.content").isNotEmpty())
.andExpect(jsonPath("$.payload.pageable").isNotEmpty())
.andExpect(jsonPath("$.payload.last").value(false))
.andExpect(jsonPath("$.payload.totalPages").value(3)) // page count
.andExpect(jsonPath("$.payload.totalElements").value(9)) // total count
.andExpect(jsonPath("$.payload.size").value(4)) // elements per page
.andExpect(jsonPath("$.payload.numberOfElements").value(4)) // elements in page
.andExpect(jsonPath("$.payload.number").value(0)) // current page number
.andExpect(jsonPath("$.payload.content").isArray())
// oops, lets not check dates, they're created on the instant
.andExpect(jsonPath("$.payload.content[0].pos").value("1"))
.andExpect(jsonPath("$.payload.content[0].title").value("Opening the aircraft"))
.andExpect(jsonPath("$.payload.content[0].category").value("AIRFRAME"))
;
}
#After
public void tearDown() {
MaintainerFixtures.removeFranks(franks, maintainerService, aircraftService);
WorkItemFixtures.killJobs(workSetService, workSetTemplateService);
UserAndRoleFixtures.killAllUsers(userService, roleService);
}
}
As the project makes extensive use of JPA, there are annotations and a lot of data setup, but all of this has worked fine with other tests, and there don't appear to be any problems with the data. In fact, a peek at the JSON output for the work order that this method should be querying...
work order JSON
...basically has all the data correctly set up. The Spring Boot startup includes this line:
2018-11-12 06:32:17.362 INFO 83372 --- [ main] s.w.s.m.m.a.RequestMappingHandlerMapping : Mapped "{[/api/maintainers/aircrafts/workorders/workitems/{wid}/parts],methods=[GET],produces=[application/json]}" onto au.com.avmaint.api.common.Response<org.springframework.data.domain.Page<au.com.avmaint.api.aircraft.model.WorkItem>> au.com.avmaint.api.aircraft.WorkItemController.getPagedParts(java.util.Optional<java.lang.Long>,org.springframework.data.domain.Pageable)
So the path appears to be OK. Now to the .andDo(print()) output:
MockHttpServletRequest:
HTTP Method = GET
Request URI = /maintainers/aircrafts/workorders/workitems/5/parts
Parameters = {}
Headers = {Authorization=[Bearer eyJhbGciOiJIUzUxMiJ9.eyJzdWIiOiJmcmFua0BmcmFua3MuY29tIiwic2NvcGVzIjpbIlJPTEVfQURNSU4iLCJST0xFX0JBU0lDIl0sImV4cCI6MTU0MjgyODczOH0.QOTiyWG_pVL9qb8MDG-2c_nkTnsIzceUH-5vvtmpZhBcdro9HqVADojK0-c6B1sAOOYOcprpwg4-wrBF0PGweg]}
Body = <no character encoding set>
Session Attrs = {}
Handler:
Type = org.springframework.web.servlet.resource.ResourceHttpRequestHandler
Async:
Async started = false
Async result = null
Resolved Exception:
Type = null
ModelAndView:
View name = null
View = null
Model = null
FlashMap:
Attributes = null
MockHttpServletResponse:
Status = 404
Error message = null
Headers = {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY]}
Content type = null
Body =
Forwarded URL = null
Redirected URL = null
Cookies = []
And there's the 404. I guess I'm breaking something somewhere; I just can't see what it is. Can anyone help with this?
Sorry everyone: this is the effect of tearing my hair out for ages, finally posting the question, and then finding the problem moments later.
The issue was that I forgot to put /api as the prefix on the path in the test. This prefix is applied to every controller with:
@RestController
@RequestMapping("/api")
public class WorkItemController {
so, yeah: it works now
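For completeness, the only change needed in the test was the /api prefix on the request path; a trimmed-down version of the corrected call (other assertions unchanged):
mvc.perform(get("/api/maintainers/aircrafts/workorders/workitems/" + opening.getId() + "/parts")
        .header(AUTHORIZATION_HEADER, "Bearer " + adminJwtToken))
        .andExpect(status().isOk());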

spring-integration-dsl: Make Feed-Flow Work

I'm trying to code an RSS-feed reader with a configured set of RSS feeds. I thought a good approach would be to code a prototype @Bean and call it for each RSS feed found in the configuration.
However, I guess I'm missing a point here: the application launches, but nothing happens. The beans are created as I'd expect, yet no logging happens in that handle() method:
@Component
public class HomeServerRunner implements ApplicationRunner {

    private static final Logger logger = LoggerFactory.getLogger(HomeServerRunner.class);

    @Autowired
    private Configuration configuration;

    @Autowired
    private FeedConfigurator feedConfigurator;

    @Override
    public void run(ApplicationArguments args) throws Exception {
        List<IntegrationFlow> feedFlows = configuration.getRssFeeds()
                .entrySet()
                .stream()
                .peek(entry -> System.out.println(entry.getKey()))
                .map(entry -> feedConfigurator.feedFlow(entry.getKey(), entry.getValue()))
                .collect(Collectors.toList());
        // this one appears in the log-file and looks good
        logger.info("Flows: " + feedFlows);
    }
}
@Configuration
public class FeedConfigurator {

    private static final Logger logger = LoggerFactory.getLogger(FeedConfigurator.class);

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    public IntegrationFlow feedFlow(String name, FeedConfiguration configuration) {
        return IntegrationFlows
                .from(Feed
                                .inboundAdapter(configuration.getSource(), getElementName(name, "adapter"))
                                .feedFetcher(new HttpClientFeedFetcher()),
                        spec -> spec.poller(Pollers.fixedRate(configuration.getInterval())))
                .channel(MessageChannels.direct(getElementName(name, "in")))
                .enrichHeaders(spec -> spec.header("feedSource", configuration))
                .channel(getElementName(name, "handle"))
                //
                // it would be nice if the following would show something:
                //
                .handle(m -> logger.debug("Payload: " + m.getPayload()))
                .get();
    }

    private String getElementName(String name, String postfix) {
        name = "feedChannel" + StringUtils.capitalize(name);
        if (!StringUtils.isEmpty(postfix)) {
            name += "." + postfix;
        }
        return name;
    }
}
What's missing here? It seems as if I need to "start" the flows somehow.
Prototype beans need to be "used" somewhere; if nothing references the bean, no instance will be created.
Further, you can't put an IntegrationFlow @Bean in that scope: it generates a bunch of beans internally, and those won't be in that scope.
See the answer to this question and its follow-up for one technique you can use to create multiple adapters with different properties.
Alternatively, the upcoming 1.2 version of the DSL has a mechanism to register flows dynamically.
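For reference, a rough sketch of what that dynamic registration could look like with IntegrationFlowContext (the names Configuration, FeedConfigurator, and feedFlow(...) are taken from the question; here feedFlow(...) is assumed to be a plain factory method rather than a prototype @Bean):
@Component
public class HomeServerRunner implements ApplicationRunner {

    @Autowired
    private Configuration configuration;

    @Autowired
    private FeedConfigurator feedConfigurator;

    @Autowired
    private IntegrationFlowContext flowContext;

    @Override
    public void run(ApplicationArguments args) {
        configuration.getRssFeeds().forEach((name, feedConfiguration) ->
                // Registering the flow creates and starts all of its internal beans,
                // which is the "start" step that a plain prototype bean never triggers.
                flowContext.registration(feedConfigurator.feedFlow(name, feedConfiguration))
                        .id("feedFlow." + name)
                        .register());
    }
}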
