@Header and Spring Cloud Stream functional programming model - spring-kafka

Is there a way to use @Header inside the following Kafka consumer code? I am using Spring Cloud Stream (Kafka Streams binder implementation), and my implementation uses the functional programming model, for example:
@Bean
public Consumer<KStream<String, Pojo>> process() {
    return messages -> messages.foreach((k, v) -> process(v));
}
If using Spring for Apache Kafka, this can be as simple as:
@KafkaListener(topics = "${mytopicname}", clientIdPrefix = "${myprefix}", errorHandler = "customEventErrorHandler")
public void processEvent(@Header(KafkaHeaders.RECEIVED_MESSAGE_KEY) String key,
                         @Header(KafkaHeaders.RECEIVED_PARTITION_ID) int partition,
                         @Header(KafkaHeaders.RECEIVED_TOPIC) String topic,
                         @Header(KafkaHeaders.RECEIVED_TIMESTAMP) long ts,
                         @Valid Pojo pojo) {
    ...
    // use headers here
    ...
}

No; the Kafka Streams binder is not based on Spring Messaging.
You can access headers, topic, and such in a Transformer (via the ProcessorContext) added to your stream.
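For illustration only, a minimal sketch of such a Transformer, assuming the same Pojo value type as in the question; the topic, partition, timestamp, and record headers are all exposed by the ProcessorContext:

import org.apache.kafka.common.header.Headers;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.kstream.Transformer;
import org.apache.kafka.streams.processor.ProcessorContext;

public class HeaderAwareTransformer implements Transformer<String, Pojo, KeyValue<String, Pojo>> {

    private ProcessorContext context;

    @Override
    public void init(ProcessorContext context) {
        this.context = context;
    }

    @Override
    public KeyValue<String, Pojo> transform(String key, Pojo value) {
        // metadata and headers of the record currently being processed
        Headers headers = context.headers();
        String topic = context.topic();
        long timestamp = context.timestamp();
        // ... use headers/topic/timestamp together with the value ...
        return KeyValue.pair(key, value);
    }

    @Override
    public void close() {
    }
}

It would be wired in with something like messages.transform(HeaderAwareTransformer::new) ahead of the foreach.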
You can use the Kafka message channel binder with:
@Bean
public Consumer<Message<Pojo>> process() {
    return message -> ...
}
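As a sketch (not part of the original answer), the Kafka record metadata is then available on the Spring Message via the standard KafkaHeaders constants, reusing the question's process(...) method:

@Bean
public Consumer<Message<Pojo>> process() {
    return message -> {
        // header names come from org.springframework.kafka.support.KafkaHeaders
        Object key = message.getHeaders().get(KafkaHeaders.RECEIVED_MESSAGE_KEY);
        Object topic = message.getHeaders().get(KafkaHeaders.RECEIVED_TOPIC);
        // use key/topic as needed
        process(message.getPayload());
    };
}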

Related

Spring Integration with Kafka not sending messages

I am working on Spring Integration with Kafka using the Java DSL, and I see that the messages are not produced/sent to the Kafka topic.
The code I have been using is:
@Bean
public IntegrationFlow sendToKafkaFlow() {
    return IntegrationFlows.from(kafkaPublishChannel)
            .handle(kafkaMessageHandler())
            .get();
}

private KafkaProducerMessageHandlerSpec<String, Object, ?> kafkaMessageHandler() {
    return Kafka
            .outboundChannelAdapter(_kafkaProducerFactory.getKafkaTemplate().getProducerFactory())
            .messageKey(m -> m
                    .getHeaders()
                    .getId())
            //.headerMapper(mapper())
            .topic(_topicConfiguration.getCheProgressUpdateTopic())
            .configureKafkaTemplate(t -> t.getTemplate());
}

@Bean
public DefaultKafkaHeaderMapper mapper() {
    return new DefaultKafkaHeaderMapper();
}
The producer configurations I am using are:
private ProducerFactory<String, Object> producerFactory() {
final Map<String, Object> producerProps = new HashMap<>();
producerProps.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProducerConfiguration.getKafkaServerProducerHost());
producerProps.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
producerProps.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
producerProps.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, kafkaProducerConfiguration.isKafkaProducerIdempotentEnabled());
producerProps.put(ProducerConfig.ACKS_CONFIG, kafkaProducerConfiguration.getKafkaProducerAcks());
producerProps.put(ProducerConfig.RETRIES_CONFIG, kafkaProducerConfiguration.getKafkaProducerRetries());
producerProps.put(ProducerConfig.BATCH_SIZE_CONFIG, kafkaProducerConfiguration.getKafkaProducerBatchSize());
producerProps.put(ProducerConfig.LINGER_MS_CONFIG, kafkaProducerConfiguration.getKafkaProducerLingerMs());
return new DefaultKafkaProducerFactory<>(producerProps);
}
Not sure why I am not seeing the messages in the Kafka topic. Can you please help me out here?
Try using sync(true) on that Kafka.outboundChannelAdapter(). I believe there should be some errors reported if you don't see any progress during sending. You may also consider using DEBUG logging for the org.springframework.integration category to see how your messages travel through your integration flow.
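For example (a sketch based on the question's own handler, keeping the same assumed fields), the sync flag could be added to the adapter spec, and the DEBUG level enabled via a Spring Boot property:

private KafkaProducerMessageHandlerSpec<String, Object, ?> kafkaMessageHandler() {
    return Kafka
            .outboundChannelAdapter(_kafkaProducerFactory.getKafkaTemplate().getProducerFactory())
            .sync(true) // block until the broker acknowledges (or fails) the send
            .messageKey(m -> m.getHeaders().getId())
            .topic(_topicConfiguration.getCheProgressUpdateTopic())
            .configureKafkaTemplate(t -> t.getTemplate());
}

logging.level.org.springframework.integration=DEBUG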

Building a reactive long polling API with webflux

I'm trying to use Spring WebFlux to build a reactive long-polling API service. I have implemented a service that produces a data stream from my data source (Kafka), and the caller of this long-polling API is supposed to get/wait for the first item with the requested key. But somehow the API does not return the data when the data only arrives after the request has been received. Am I using the correct Reactor methods? Thanks so much!
My service:
@Service
public class KafkaService {

    // the processor is an in-memory cache that keeps track of the old messages
    @Autowired private KafkaMessageProcessor kafkaMessageProcessor;

    @Autowired private ApplicationEventService applicationEventService;

    private Flux<KafkaMessage> eventFlux;

    @PostConstruct
    public void setEventFlux() {
        this.eventFlux = Flux.create(applicationEventService).share();
    }

    public Flux<KafkaMessage> listenToKeyFlux(String key) {
        return kafkaMessageProcessor
                .getAll()
                .concatWith(this.eventFlux.share())
                .filter(x -> x.getKey().equals(key));
    }
}
My handler:
#RestController

#RequestMapping("/longpolling")
public class LongPollingController {


 private static final String MSG_PREFIX = "key_";


 #Autowired private KafkaService kafkaService;
#GetMapping("/message/{key}")

#CrossOrigin(allowedHeaders = "*", origins = "*")

public CompletableFuture<KafkaMessage> getMessage(#PathVariable String key) {

return kafkaService.listenToKeyFlux(MSG_PREFIX + key).shareNext().toFuture();

}
}

Spring cloud stream - Autowiring underlying Consumer for a given PollableMessageSource

Is it possible to get hold of the underlying KafkaConsumer bean for a defined PollableMessageSource?
I have a binding defined as:
public interface TestBindings {
    String TEST_SOURCE = "test";

    @Input(TEST_SOURCE)
    PollableMessageSource testTopic();
}
and config class:
@EnableBinding(TestBindings.class)
public class TestBindingsPoller {

    @Bean
    public ApplicationRunner testPoller(PollableMessageSource testTopic) {
        // Get kafka consumer for PollableMessageSource
        KafkaConsumer kafkaConsumer = getConsumer(testTopic);
        return args -> {
            while (true) {
                if (!testTopic.poll(...)) {
                    Thread.sleep(500);
                }
            }
        };
    }
}
The question is, how can I get the KafkaConsumer that corresponds to testTopic? Is there any way to get it from the beans that are wired in Spring Cloud Stream?
The KafkaMessageSource populates a KafkaConsumer into headers, so it is available in the place you receive messages: https://github.com/spring-projects/spring-kafka/blob/master/spring-kafka/src/main/java/org/springframework/kafka/support/converter/MessageConverter.java#L57.
If you are going to do things like poll yourself, I would suggest injecting a ConsumerFactory and creating a consumer from it directly instead.
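A minimal sketch of that suggestion (the group id, client id, and String/String generic types below are placeholders for illustration):

@Bean
public ApplicationRunner testPoller(PollableMessageSource testTopic,
                                    ConsumerFactory<String, String> consumerFactory) {
    return args -> {
        // create and own a consumer directly instead of digging it out of the binder
        try (Consumer<String, String> consumer =
                     consumerFactory.createConsumer("test-group", "test-client")) {
            // use the consumer here (metrics, seeking, etc.) alongside testTopic.poll(...)
        }
    };
}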

Flink Retrofit not Serializable Exception

I have a Flink Job reading events from a Kafka queue then calling another service if certain conditions are met.
I wanted to use Retrofit2 to call the REST endpoint of that service, but I get a "not serializable" exception. I have several flat maps connected to each other (in series), and the call to the service happens in the last FlatMap. The exception I get:
Exception in thread "main" org.apache.flink.api.common.InvalidProgramException:
The implementation of the RichFlatMapFunction is not serializable.
The object probably contains or references non serializable fields.
...
Caused by: java.io.NotSerializableException: retrofit2.Retrofit$1
...
The way I am initializing Retrofit:
RetrofitClient.getClient(BASE_URL).create(NotificationService.class);
And the NotificationService interface
public interface NotificationService {
    @PUT("/test")
    Call<String> putNotification(@Body Notification notification);
}
The RetrofitClient class
public class RetrofitClient {
    private static Retrofit retrofit = null;

    public static Retrofit getClient(String baseUrl) {
        if (retrofit == null) {
            retrofit = new Retrofit.Builder()
                    .baseUrl(baseUrl)
                    .addConverterFactory(GsonConverterFactory.create())
                    .build();
        }
        return retrofit;
    }
}
Post your Notification class code for more details, but it looks like this answer helps:
java.io.NotSerializableException with "$1" after class
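Not from the linked answer, but a common way to avoid the serialization problem in Flink is to make the client transient and build it in open(), so it is created on the task managers rather than shipped with the job graph. A rough sketch, reusing the question's RetrofitClient and NotificationService (the input/output types and class name are assumptions):

import org.apache.flink.api.common.functions.RichFlatMapFunction;
import org.apache.flink.configuration.Configuration;
import org.apache.flink.util.Collector;

public class NotifyingFlatMap extends RichFlatMapFunction<Notification, String> {

    private final String baseUrl;
    private transient NotificationService notificationService;

    public NotifyingFlatMap(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    @Override
    public void open(Configuration parameters) {
        // built per task instance at runtime, never serialized with the job
        notificationService = RetrofitClient.getClient(baseUrl).create(NotificationService.class);
    }

    @Override
    public void flatMap(Notification notification, Collector<String> out) throws Exception {
        out.collect(notificationService.putNotification(notification).execute().body());
    }
}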

Spring Cloud Feign with OAuth2RestTemplate

I'm trying to implement Feign clients to get my user info from the user service. Currently I'm requesting with OAuth2RestTemplate and it works, but now I wish to switch to Feign and I'm getting a 401 error, probably because the request doesn't carry the user's token. Is there a way to customize the RestTemplate that Spring's Feign support uses, so I can use my own bean?
This is how I'm implementing it today.
The service that calls the client:
@Retryable({RestClientException.class, TimeoutException.class, InterruptedException.class})
@HystrixCommand(fallbackMethod = "getFallback")
public Promise<ResponseEntity<UserProtos.User>> get() {
    logger.debug("Requiring discovery of user");
    Promise<ResponseEntity<UserProtos.User>> promise = Broadcaster.<ResponseEntity<UserProtos.User>>create(reactorEnv, DISPATCHER)
            .observe(Promises::success)
            .observeError(Exception.class, (o, e) -> Promises.error(reactorEnv, ERROR_DISPATCHER, e))
            .filter(entity -> entity.getStatusCode().is2xxSuccessful())
            .next();
    promise.onNext(this.client.getUserInfo());
    return promise;
}
And the client
#FeignClient("account")
public interface UserInfoClient {
#RequestMapping(value = "/uaa/user",consumes = MediaTypes.PROTOBUF,method = RequestMethod.GET)
ResponseEntity<UserProtos.User> getUserInfo();
}
Feign doesn't use a RestTemplate, so you'd have to find a different way. If you create a @Bean of type feign.RequestInterceptor, it will be applied to all requests, so one of those with an OAuth2RestTemplate inside (just to manage the token acquisition) would probably be the best option.
This is my solution, just to complement the other answer with source code implementing the feign.RequestInterceptor interface:
@Bean
public RequestInterceptor requestTokenBearerInterceptor() {
    return new RequestInterceptor() {
        @Override
        public void apply(RequestTemplate requestTemplate) {
            OAuth2AuthenticationDetails details = (OAuth2AuthenticationDetails)
                    SecurityContextHolder.getContext().getAuthentication().getDetails();
            requestTemplate.header("Authorization", "bearer " + details.getTokenValue());
        }
    };
}
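For completeness, a sketch of the other option mentioned in the accepted answer, using an OAuth2RestTemplate purely for token acquisition (the injected oauth2RestTemplate bean and the interceptor name are assumptions of this sketch):

@Bean
public RequestInterceptor oauth2TokenInterceptor(OAuth2RestTemplate oauth2RestTemplate) {
    // OAuth2RestTemplate takes care of fetching/refreshing the access token
    return requestTemplate -> requestTemplate.header(
            "Authorization", "Bearer " + oauth2RestTemplate.getAccessToken().getValue());
}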
