spring-boot embedded kafka issue : I am getting Invalid receive (size = 369296129 larger than 104857600) - integration-testing

I am writing integration test cases using spring boot, embedded kafka, and temporal. I am trying to send a message on a kafka topic.
@SpringBootTest(classes = Application.class)
@RunWith(SpringJUnit4ClassRunner.class)
@ActiveProfiles("test")
@DirtiesContext
@EmbeddedKafka(
        partitions = 5,
        controlledShutdown = true,
        brokerProperties = {
                "listeners=PLAINTEXT://localhost:9092",
                "port=9092"
        })
public class OutboundFlowIT {

    private final Logger logger = LoggerFactory.getLogger(OutboundFlowIT.class);
    private TestWorkflowEnvironment testEnv;
    private Worker worker;
    private WorkflowClient workflowClient;

    @Autowired
    private ActivityService activityService;

    @Autowired
    private EventSender sender;

    @Before
    public void setUp() {
        // some setup code.
    }

    @Test
    public void processOutboundFinancialMessage_shouldTriggerAllSteps_WhenOK() throws IOException, InterruptedException {
        // logic for sending message to intended topic.
    }
}
But I am getting the error below.
org.apache.kafka.common.network.InvalidReceiveException: Invalid receive (size = 369296129 larger than 104857600)
at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:105) ~[kafka-clients-2.5.1.jar:na]
at org.apache.kafka.common.network.KafkaChannel.receive(KafkaChannel.java:447) ~[kafka-clients-2.5.1.jar:na]
at org.apache.kafka.common.network.KafkaChannel.read(KafkaChannel.java:397) ~[kafka-clients-2.5.1.jar:na]
at org.apache.kafka.common.network.Selector.attemptRead(Selector.java:678) ~[kafka-clients-2.5.1.jar:na]
at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:580) ~[kafka-clients-2.5.1.jar:na]
at org.apache.kafka.common.network.Selector.poll(Selector.java:485) ~[kafka-clients-2.5.1.jar:na]
at kafka.network.Processor.poll(SocketServer.scala:861) ~[kafka_2.12-2.5.1.jar:na]
at kafka.network.Processor.run(SocketServer.scala:760) ~[kafka_2.12-2.5.1.jar:na]
I have also added the configurations below in kafka.properties, but I am still getting the same issue.
spring.kafka.producer.properties.max.request.size=569296129
spring.kafka.consumer.properties.max.partition.fetch.bytes=369296129
I am new to Kafka; please help me.
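Note: the 104857600 in the error is the broker's default socket.request.max.bytes, so the producer/consumer client properties above cannot raise it; that limit lives on the broker side. If the payload really were that large, it would have to be raised on the embedded broker itself, for example via brokerProperties. This is only a sketch with illustrative values:

// sketch only: raise the broker-side limits on the embedded broker
// (values here are illustrative, not a recommendation)
@EmbeddedKafka(
        partitions = 5,
        brokerProperties = {
                "listeners=PLAINTEXT://localhost:9092",
                "port=9092",
                "socket.request.max.bytes=469296129",
                "message.max.bytes=469296129"
        })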

How do you send the messages to the Kafka broker? You should use the Kafka protocol and not an HTTP request; similar issues suggest that is what this error comes from.
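If the message is meant to be produced by the test itself, it should go through a Kafka producer pointed at the embedded broker rather than any HTTP call. A minimal sketch using spring-kafka-test (the injected EmbeddedKafkaBroker field, the topic name and the payload are assumptions, not taken from the original test):

@Autowired
private EmbeddedKafkaBroker embeddedKafka;

@Test
public void sendsMessageOverKafkaProtocol() {
    // producerProps() points the client at the embedded broker's listener,
    // so the message goes over the Kafka wire protocol
    Map<String, Object> props = KafkaTestUtils.producerProps(embeddedKafka);
    KafkaTemplate<String, String> template = new KafkaTemplate<>(
            new DefaultKafkaProducerFactory<>(props, new StringSerializer(), new StringSerializer()));
    template.send("outbound-topic", "{\"some\":\"payload\"}"); // hypothetical topic and payload
    template.flush();
}

This assumes org.springframework.kafka.test.utils.KafkaTestUtils and org.springframework.kafka.core.KafkaTemplate from spring-kafka / spring-kafka-test are on the test classpath.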

Related

contractVerifierMessaging.receive is null

I'm setting up contract tests for Kafka messaging with Testcontainers in the way described in spring-cloud-contract-samples/producer_kafka_middleware/. It works fine with Embedded Kafka but not with Testcontainers.
When I try to run the generated ContractVerifierTest:
public void validate_shouldProduceKafkaMessage() throws Exception {
    // when:
    triggerMessageSent();

    // then:
    ContractVerifierMessage response = contractVerifierMessaging.receive("kafka-messages",
            contract(this, "shouldProduceKafkaMessage.yml"));
then the following is thrown:
Cannot invoke "org.springframework.messaging.Message.getPayload()" because "receive" is null
The Kafka container is running and the topic is created. When debugging the receive method I see the message is null in message(destination).
Contract itself:
label("triggerMessage")
input {
triggeredBy("triggerMessageSent()")
}
outputMessage {
sentTo "kafka-messages"
body(file("kafkaMessage.json"))
Base test configuration:
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE, classes = {TestConfig.class, ServiceApplication.class})
@Testcontainers
@AutoConfigureMessageVerifier
@ActiveProfiles("test")
public abstract class BaseClass {
What am I missing? Maybe a point of communication between the container and ContractVerifierMessage methods?
Resolved the issue by adding a specific topic name to the listen() method in the KafkaMessageVerifier implementation class.
So instead of @KafkaListener(id = "listener", topicPattern = ".*"), it works with:
@KafkaListener(topics = {"my-messages-topic"})
public void listen(ConsumerRecord payload, @Header(KafkaHeaders.RECEIVED_TOPIC) String topic) {
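For context, in that sample the Kafka listener feeds an in-memory store that the verifier's receive(destination) polls; if the listener never receives anything for the topic, receive() returns null and the generated test fails as above. A rough sketch of that shape (the class name, field and structure here are illustrative assumptions, not the sample's exact code):

import java.util.Map;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.LinkedBlockingQueue;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.messaging.handler.annotation.Header;
import org.springframework.stereotype.Component;

@Component
public class KafkaMessageVerifier {

    // records are stored per topic so receive(destination) can look them up later
    private final Map<String, BlockingQueue<Object>> broker = new ConcurrentHashMap<>();

    @KafkaListener(topics = {"my-messages-topic"})
    public void listen(ConsumerRecord<?, ?> payload,
                       @Header(KafkaHeaders.RECEIVED_TOPIC) String topic) {
        broker.computeIfAbsent(topic, t -> new LinkedBlockingQueue<>()).add(payload.value());
    }

    // receive(destination, timeout) then simply polls the queue for that destination
}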

Building a reactive long polling API with webflux

I'm trying to use Spring WebFlux to build a reactive long-polling API service. I have implemented a service that produces a data stream from my data resource (Kafka), and the caller of this long-polling API is supposed to get/wait for the first data item with the requested key. But somehow the API does not return the data when the data is received only after the application has received the request. Am I using the correct Reactor methods? Thanks so much!
My service:
@Service
public class KafkaService {

    // the processor is an in-memory cache that keeps track of the old messages
    @Autowired private KafkaMessageProcessor kafkaMessageProcessor;

    @Autowired private ApplicationEventService applicationEventService;

    private Flux<KafkaMessage> eventFlux;

    @PostConstruct
    public void setEventFlux() {
        this.eventFlux = Flux.create(applicationEventService).share();
    }

    public Flux<KafkaMessage> listenToKeyFlux(String key) {
        return kafkaMessageProcessor
                .getAll()
                .concatWith(this.eventFlux.share())
                .filter(x -> x.getKey().equals(key));
    }
}
My handler:
@RestController
@RequestMapping("/longpolling")
public class LongPollingController {

    private static final String MSG_PREFIX = "key_";

    @Autowired private KafkaService kafkaService;

    @GetMapping("/message/{key}")
    @CrossOrigin(allowedHeaders = "*", origins = "*")
    public CompletableFuture<KafkaMessage> getMessage(@PathVariable String key) {
        return kafkaService.listenToKeyFlux(MSG_PREFIX + key).shareNext().toFuture();
    }
}
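For reference, in WebFlux the same handler can return a Mono directly instead of bridging to a CompletableFuture. The sketch below (using reactor.core.publisher.Mono and java.time.Duration) only illustrates that shape and does not by itself change which events are seen; the next() call and the 30-second timeout are assumptions, not part of the original code:

// sketch only: complete the long poll with the first matching message, bounded by a timeout
@GetMapping("/message/{key}")
@CrossOrigin(allowedHeaders = "*", origins = "*")
public Mono<KafkaMessage> getMessageMono(@PathVariable String key) {
    return kafkaService.listenToKeyFlux(MSG_PREFIX + key)
            .next()                           // first element with the requested key
            .timeout(Duration.ofSeconds(30)); // assumed long-poll timeout
}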

Connect to non-standard couchbase ports for integration testing

I'm trying to set up automated integration-tests for a Spring Boot application that connects to couchbase. I want to test against a fresh containerised database using random available ports so that we don't need to worry about port collisions on developer machines or in continuous integration environments.
So far I've created a test that uses com.palantir.docker.compose.DockerComposeRule to spin up a couchbase docker image that exposes the standard ports on random available ports. However, org.springframework.boot.autoconfigure.couchbase.CouchbaseProperties doesn't expose any properties for ports.
I can see that org.springframework.data.couchbase.config.CouchbaseEnvironmentParser suggests that there is some way to set the ports. I haven't been able to work out how to use it though. Has anyone else been able to do this?
This is what my test looks like so far:
@Slf4j
@RunWith(SpringRunner.class)
@SpringBootTest
@Category(IntegrationTest.class)
public class BiscuitRepositoryIT {

    private static final int COUCHBASE_PORT = 8091;
    private static final String COUCHBASE_SERVICE = "couchbase";

    @ClassRule
    public static DockerComposeRule docker = DockerComposeRule.builder()
            .file("src/test/resources/docker-compose-couchbase.yml")
            .projectName(ProjectName.random())
            .waitingForService(COUCHBASE_SERVICE, HealthChecks.toHaveAllPortsOpen())
            .build();

    @Autowired
    private BiscuitRepositoryConfig config;

    @Autowired
    private BiscuitRepository biscuitRepository;

    @BeforeClass
    public static void initialize() {
        DockerPort couchbase = docker.containers()
                .container(COUCHBASE_SERVICE)
                .port(COUCHBASE_PORT);
        log.info("couchbase ports: {}", couchbase);
        // TODO update integration-test properties file with the ports
        // before the spring context starts.
    }

    @After
    public void tearDown() throws Exception {
        biscuitRepository.deleteAll();
    }

    @Test
    public void insertBiscuit() throws Exception {
        Biscuit digestive = Biscuit.builder().name("digestive").manufacturer("biscuitCorp").build();
        Biscuit persistedBiscuit = biscuitRepository.save(digestive);
        assertThat(persistedBiscuit).isEqualToIgnoringGivenFields(digestive, "id", "version");
    }
}
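One way to bridge the TODO above is to capture the randomly mapped port into a system property before the Spring context starts, and feed it into the Couchbase SDK environment from the configuration. The sketch below is only an illustration: the property name couchbase.bootstrap.http.port is made up, DefaultCouchbaseEnvironment comes from the Couchbase Java SDK (com.couchbase.client.java.env), and how the CouchbaseEnvironment bean gets wired in depends on the Spring Boot / Spring Data Couchbase version in use:

// in the test: the @ClassRule has already started the container,
// so the mapped port is known before the context loads
@BeforeClass
public static void initialize() {
    DockerPort couchbase = docker.containers()
            .container(COUCHBASE_SERVICE)
            .port(COUCHBASE_PORT);
    System.setProperty("couchbase.bootstrap.http.port",
            String.valueOf(couchbase.getExternalPort()));
}

// in the (test) configuration: point the SDK at the non-standard HTTP port
@Bean
public CouchbaseEnvironment couchbaseEnvironment(
        @Value("${couchbase.bootstrap.http.port:8091}") int httpPort) {
    return DefaultCouchbaseEnvironment.builder()
            .bootstrapHttpDirectPort(httpPort)
            .build();
}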

spring boot kafka consumer application to implement heartbeat

Below is my Spring Boot Kafka consumer application that reads data from a Kafka topic. In this application we are planning to implement heartbeat functionality that posts a heartbeat to a URL using the @Scheduled annotation, so we know the app (which loads my JSON input data into the DB) is alive and running. The purpose of this POST request is to update the status on the application monitoring tool.
To achieve this I placed my heartbeat code in many places of my application, but I could not achieve it because @PostConstruct or consumer.poll() does not allow the heartbeat code to run.
We are using Apache Kafka 2.12. What would be the right approach to implement this behaviour in my Spring Boot app? Is there any other API to make such a POST request to a URL every few minutes throughout the life of the application? Would writing a background thread resolve this issue? If so, please share an example. Why does @PostConstruct or poll() block the other recurring code from running?
Please help me. Thanks in advance.
@SpringBootApplication
@EnableScheduling
public class KafkaApp {

    @Autowired
    ConsumerService kcService;

    public static void main(String[] args) {
        SpringApplication.run(KafkaApp.class, args);
    }

    @PostConstruct
    public void init() {
        kcService.getMessagesFromKafka();
    }
}
and 2 @Service definitions:
import org.apache.kafka.clients.consumer.Consumer;

@Service
public class ConsumerService {

    final Consumer<Long, String> consumer = createConsumer();
    final int giveUp = 100;
    int noRecordsCount = 0;

    public void getMessagesFromKafka() {
        while (true) {
            final ConsumerRecords<Long, String> consumerRecords = consumer.poll(1000);
            if (consumerRecords.count() == 0) {
                noRecordsCount++;
                if (noRecordsCount > giveUp) break;
                else continue;
            }
            consumerRecords.forEach(record -> {
                System.out.printf("Consumer Record:(%d, %s, %d, %d)\n",
                        record.key(), record.value(),
                        record.partition(), record.offset());
            });
            consumer.commitAsync();
        }
    }
    @Scheduled(fixedDelay = 180000)
    public void heartbeat() {
        RestTemplate restTemplate = new RestTemplate();
        String url = "endpoint url";
        String requestJson = "{\"I am alive\":\"App name?\"}";
        HttpHeaders headers = new HttpHeaders();
        headers.setContentType(MediaType.APPLICATION_JSON);
        HttpEntity<String> entity = new HttpEntity<String>(requestJson, headers);
        String answer = restTemplate.postForObject(url, entity, String.class);
        System.out.println(answer);
    }
Add the @EnableScheduling annotation to your main class:
@SpringBootApplication
@EnableScheduling
public class KafkaApp {

    @Autowired
    ConsumerService kcService;

    public static void main(String[] args) {
        SpringApplication.run(KafkaApp.class, args);
    }

    @PostConstruct
    public void init() {
        kcService.getMessagesFromKafka();
    }
}
For more detail you can visit this link: spring-boot-task-scheduling-with-scheduled-annotation
If you want to drive it with a cron expression instead, add this to application.properties:
# runs every 5 seconds
cron.expression=*/5 * * * * *
You can build a cron expression online; here is a link: cron-expression-generator-quartz.
And annotate your heartbeat function like this:
@Scheduled(cron = "${cron.expression}")
public void heartbeat() {
    // Your code here.
}
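On the background-thread part of the question: the @PostConstruct call blocks startup because getMessagesFromKafka() loops on the calling thread, and until startup completes the scheduler never gets a chance to run the heartbeat. A minimal sketch of starting the poll loop on its own thread once the context is ready (the ConsumerStarter class, the single-thread executor and the ApplicationReadyEvent wiring are assumptions, not part of the original code):

import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.context.event.EventListener;
import org.springframework.stereotype.Component;

@Component
public class ConsumerStarter {

    private final ConsumerService kcService;
    // one dedicated thread so the endless poll loop never blocks startup or the scheduler
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    public ConsumerStarter(ConsumerService kcService) {
        this.kcService = kcService;
    }

    @EventListener(ApplicationReadyEvent.class)
    public void startConsumer() {
        executor.submit(kcService::getMessagesFromKafka);
    }
}

With something like this in place the @PostConstruct call in KafkaApp is no longer needed, and the @Scheduled heartbeat() can fire every few minutes as intended.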

Spring Cloud Contract Consumer Test Issue

I am testing the consumer side of the spring cloud contract.
The provider is here: https://github.com/pkid/spring-cloud-contract-with-surefire.
The stubs jar generated from the provider is here: https://github.com/pkid/spring-cloud-contract-with-surefire-consumer/blob/master/sample-repo-service-1.0.0-SNAPSHOT-stubs.jar
When I run the consumer test (source is here: https://github.com/pkid/spring-cloud-contract-with-surefire-consumer):
@Test
public void shouldGiveFreeSubscriptionForFriends() throws Exception {
    mockMvc.perform(MockMvcRequestBuilders.get("/greeting")
            .contentType(MediaType.APPLICATION_JSON))
            .andExpect(status().isOk())
            .andExpect(content().string("{\"id\":1,\"content\":\"Hello, World!\"}"));
}
When I do "mvn test", I can see that the stubs jar is correctly found and unpacked. However I got the error that the endpoint 2 "/greeting" does not exist(404).
Could you please help me? Thank you!
You are using mockMvc to connect to a WireMock instance. That won't work. Change mockMvc on the consumer side to a RestTemplate:
@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = WebEnvironment.MOCK)
@AutoConfigureMockMvc
@AutoConfigureJsonTesters
@DirtiesContext
@AutoConfigureStubRunner(ids = {"com.sap.ngp.test:sample-repo-service:+:stubs:8080"}, workOffline = true)
public class ConsumerTest {

    @Test
    public void shouldGiveFreeSubscriptionForFriends() throws Exception {
        ResponseEntity<String> result = new TestRestTemplate().exchange(RequestEntity
                .get(URI.create("http://localhost:8080/greeting"))
                .header("Content-Type", MediaType.APPLICATION_JSON_VALUE)
                .build(), String.class);

        BDDAssertions.then(result.getStatusCode().value()).isEqualTo(200);
        BDDAssertions.then(result.getBody()).isEqualTo("{\"content\":\"Hello, World!\"}");
    }
}
Please read about what MockMvc is for in the docs: http://docs.spring.io/spring-security/site/docs/current/reference/html/test-mockmvc.html
