I am trying to merge several Maps into a single map. To do this, I wrote the following piece of code:
public static Map<String, Long> mergeMaps(List<Map<String, Long>> maps) {
    Map<String, Long> mergedMap = new TreeMap<>();
    maps.forEach(map -> {
        //Map<String, Long> mx = new TreeMap<>(map);
        map.forEach((key, value) -> mergedMap.merge(key, value, Long::sum));
    });
    logger.info("Merge map successful");
    return mergedMap;
}
When I run this, the code throws an error:
org.springframework.batch.core.step.AbstractStep java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
at java.util.LinkedHashMap.forEach(LinkedHashMap.java:684) ~[?:1.8.0_201-1-ojdkbuild]
lambda$mergeMaps$1
I have this POJO:
public class MappingDto {

    public MappingDto() {
        map1 = new LinkedHashMap<>();
        map2 = new LinkedHashMap<>();
        map3 = new LinkedHashMap<>();
    }

    private Map<String, Long> map1;
    private Map<String, Long> map2;
    private Map<String, Long> map3;

    // getters/setters
    // toString()
}
I populate this POJO's maps with the code below:
mappingDto.setMap1(Arrays.stream(doc.getDescription().split("\\s+"))
        .collect(Collectors.groupingBy(Function.identity(), LinkedHashMap::new, Collectors.counting())));
For example, a description of "to be or not to be" yields {to=2, be=2, or=1, not=1}.
After generating this map, I have to merge it with one stored in a file. To read the stored map I have the method below:
public static <T> T readJsonFile(String filePath, T t) throws JAXBException, IOException, IllegalAccessException, InstantiationException {
    ObjectMapper mapper = new ObjectMapper();
    if (!Utils.checkIfFileExists(filePath)) {
        Utils.createANewFile(filePath);
        return t;
    }
    return mapper.readValue(new File(filePath), new TypeReference<T>() {});
}
After getting these maps I pass them to
mergeMaps(Arrays.asList(mappingDto.getMap1(), storedData));
and
List<Map<String, Long>> dataList = new LinkedList<>();
dataList.add(mappingDto.getMap1());
dataList.add(mappingDto.getMap2());
dataList.add(mappingDto.getMap3());
mergeMaps(dataList);
I went through this code several times, but I am not able to figure out what exactly is causing this issue.
One variation reads the stored JSON map like this:
public static <T> T readJsonFile(String filePath, Class<T> t) throws JAXBException, IOException, IllegalAccessException, InstantiationException {
    ObjectMapper mapper = new ObjectMapper();
    if (!Utils.checkIfFileExists(filePath)) {
        Utils.createANewFile(filePath);
        return t.newInstance();
    }
    return mapper.readValue(new File(filePath), new TypeReference<T>() {});
}
I tried this and several other variations, but with no results:
public static Map<String, Long> mergeMaps(List<Map<String, Long>> maps) {
    Map<String, Long> mergedMap = new TreeMap<>();
    maps.forEach(map -> {
        //Map<String, Long> mx = new TreeMap<>(map);
        map.forEach((key, value) -> mergedMap.merge(key, Long.valueOf(value), Long::sum));
    });
    logger.info("Merge map successful");
    return mergedMap;
}
Can you please help me understand where the Integer comes from and why it is being cast to Long?
Thanks in advance.
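Note: a likely explanation is the generic file reader. new TypeReference<T>(){} only captures the erased type variable T, so Jackson falls back to its default mapping and deserializes the stored JSON object as a LinkedHashMap whose numeric values are Integers (JSON integers become Integer whenever they fit in an int). The unchecked assignment to Map<String, Long> hides this until the value is unboxed inside merge, which is where the ClassCastException surfaces. A minimal sketch of a fix, assuming the stored file really holds a plain string-to-number map (readMapFile is a hypothetical replacement for the generic helper):
public static Map<String, Long> readMapFile(String filePath) throws IOException {
    ObjectMapper mapper = new ObjectMapper();
    // A concrete TypeReference survives erasure, so Jackson produces Long values.
    return mapper.readValue(new File(filePath), new TypeReference<Map<String, Long>>() {});
}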
Related
We have several microservices in our product, and in some business use cases one microservice (TryServiceOne) has to delegate a request to another microservice (TryServiceThree). The end user is waiting for the API response, so we used ReplyingKafkaTemplate so that we can respond to the caller immediately. Everything seems to be working, but we are seeing LAGs on the REPLY topic, which is bombarding our alerting system with alerts. Behind the scenes the messages are read by the RequestReplyFuture and processed successfully, yet the lag on the Kafka broker keeps increasing. Please suggest how to avoid the LAGs.
IMPORTANT
We are using a cluster deployment of the microservices with more than one node. Hence we use custom partitioning to always assign the response/reply topic to one partition.
TryServiceOne
KafkaConfiguration.class
@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(org.apache.kafka.clients.producer.ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBootstrapServers);
    props.put(org.apache.kafka.clients.producer.ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(org.apache.kafka.clients.producer.ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    return props;
}

@Bean
public Map<String, Object> consumerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBootstrapServers);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, consumerGroupId);
    return props;
}

@Bean
public ProducerFactory<String, RequestModel> requestProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, RequestModel> kafkaTemplate() {
    return new KafkaTemplate<>(requestProducerFactory());
}

@Bean
public ReplyingKafkaTemplate<String, RequestModel, ResponseModel> replyKafkaTemplate(ProducerFactory<String, RequestModel> pf,
        KafkaMessageListenerContainer<String, ResponseModel> container) {
    return new ReplyingKafkaTemplate<>(pf, container);
}

@Bean
public KafkaMessageListenerContainer<String, ResponseModel> replyContainer(ConsumerFactory<String, ResponseModel> cf) {
    TopicPartitionOffset topicPartitionOffset = new TopicPartitionOffset("RESPONSE_TOPIC", 0);
    ContainerProperties containerProperties = new ContainerProperties(topicPartitionOffset);
    containerProperties.setAckMode(ContainerProperties.AckMode.MANUAL);
    return new KafkaMessageListenerContainer<>(cf, containerProperties);
}
My SendAndReceive service component looks like this:
RequestModel requestModel = new RequestModel();
distributorRequestEvent.setDistributorModel(producerRecord);
// create producer record
ProducerRecord<String, RequestModel> record = new ProducerRecord<String, RequestModel>("REQUEST_TOPIC", requestModel);
// set reply topic in header
record.headers().add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, "RESPONSE_TOPIC".getBytes(StandardCharsets.UTF_8)));
kafkaTemplate.setDefaultReplyTimeout(Duration.ofSeconds(30));
LOGGER.info("Sending message ... {}", record);
RequestReplyFuture<String, RequestModel, ResponseModel> sendAndReceive = kafkaTemplate.sendAndReceive(record);
// confirm the producer produced successfully
SendResult<String, RequestModel> sendResult = sendAndReceive.getSendFuture().get();
// get the consumer record
ConsumerRecord<String, ResponseModel> consumerRecord = sendAndReceive.get();
return consumerRecord.value();
TryServiceThree Microservice
Kafka Configuration
@Bean
public Map<String, Object> consumerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBootstrapServers);
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.GROUP_ID_CONFIG, consumerGroupId);
    props.put(JsonDeserializer.TYPE_MAPPINGS, RequestModel.class);
    return props;
}

@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBootstrapServers);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, JsonSerializer.class);
    props.put(ProducerConfig.PARTITIONER_CLASS_CONFIG, CustomPartitioner.class);
    props.put(ProducerConfig.ACKS_CONFIG, "all");
    return props;
}

@Bean
public ConsumerFactory<String, RequestModel> requestConsumerFactory() {
    JsonDeserializer<RequestModel> deserializer = new JsonDeserializer<>(RequestModel.class);
    deserializer.setRemoveTypeHeaders(false);
    deserializer.addTrustedPackages("*");
    deserializer.setUseTypeMapperForKey(true);
    return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(), deserializer);
}

@Bean
public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String, RequestModel>> requestListenerContainerFactory() {
    ConcurrentKafkaListenerContainerFactory<String, RequestModel> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(requestConsumerFactory());
    // factory.setMessageConverter(new JsonMessageConverter());
    factory.setReplyTemplate(replyTemplate());
    return factory;
}

@Bean
public ProducerFactory<String, ResponseModel> replyProducerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public KafkaTemplate<String, ResponseModel> replyTemplate() {
    return new KafkaTemplate<>(replyProducerFactory());
}
CustomPartitioner on TryServiceThree
public class CustomPartitioner implements Partitioner {

    @Override
    public int partition(String topic, Object key, byte[] keyBytes, Object value, byte[] valueBytes, Cluster cluster) {
        // always route replies to partition 0, so every node's reply container sees them
        return 0;
    }

    @Override
    public void close() {
    }

    @Override
    public void configure(Map<String, ?> map) {
    }
}
Use
containerProperties.setAckMode(ContainerProperties.AckMode.BATCH);
in the reply container. With AckMode.MANUAL nothing ever acknowledges the replies the template consumes, so the container never commits offsets and the consumer lag on the reply topic keeps growing, even though every reply is actually processed.
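A sketch of the revised reply container (the bean from the question with only the ack mode changed):
@Bean
public KafkaMessageListenerContainer<String, ResponseModel> replyContainer(ConsumerFactory<String, ResponseModel> cf) {
    TopicPartitionOffset topicPartitionOffset = new TopicPartitionOffset("RESPONSE_TOPIC", 0);
    ContainerProperties containerProperties = new ContainerProperties(topicPartitionOffset);
    // BATCH: the container commits offsets after each poll, so the reply topic no longer accumulates lag
    containerProperties.setAckMode(ContainerProperties.AckMode.BATCH);
    return new KafkaMessageListenerContainer<>(cf, containerProperties);
}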
The messages are consumed from a Kafka topic using the JSON deserializer (Spring commons). The generic message structure is as below:
GenericEvent:
{
  "id": "10000",
  "payload": {
    "id": 100,
    "attribute1": "hi",
    "attribute2": "hello"
  },
  "type": {
    "id": 1,
    "name": "A"
  }
}
Different types have different payloads, and the structure of the payload also varies, so I would like to process the payload based on the type.
My POJO is below; in total there are 3 different payloads, and a POJO has been created for each.
public class GenericEvent<T> {
    private int id;
    private T payload;
    private Type type;
}
Right now I am using the code below to convert:
JsonNode jsonNode = objectMapper.readTree(messageFromKafka);
GenericEvent genericEvent = objectMapper.convertValue(jsonNode, new TypeReference<GenericEvent>() {});
But the code throws java.lang.ClassCastException: class java.util.LinkedHashMap cannot be cast to class GenericEvent.
Can someone help with this issue?
EDIT:
// Generic object: already provided above
// Payload object - applicable for the different types A, B, C, D
public class Payload {
    private int id;
    private String name;
    private String address;
    private String typeAAttribute1;   // applicable for type A
    private String typeAAttribute2;   // applicable for type A
    private String typeBAttribute1;   // applicable for type B
    private String typeABAttribute2;  // applicable for types A, B
    private String typeCAttribute1;   // applicable for type C
    private String typeABCAttribute1; // applicable for types A, B, C
}
Kafka consumer config:
---------------------
import org.springframework.kafka.support.serializer.JsonDeserializer;
@Bean
public ConcurrentKafkaListenerContainerFactory<Object, Object> reprocessListenerContainerFactory() {
    Map<String, Object> props = new HashMap<>();
    props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
    props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, false);
    props.put(ConsumerConfig.ALLOW_AUTO_CREATE_TOPICS_CONFIG, false);
    props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapservers);
    props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "latest");
    props.put(ConsumerConfig.GROUP_ID_CONFIG, "testgroupid");
    props.put(ConsumerConfig.REQUEST_TIMEOUT_MS_CONFIG, "300000");
    ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
            new ConcurrentKafkaListenerContainerFactory<>();
    factory.setConsumerFactory(new DefaultKafkaConsumerFactory<>(props));
    factory.getContainerProperties().setAckMode(ContainerProperties.AckMode.MANUAL_IMMEDIATE);
    factory.setRecordFilterStrategy(
        (consumerRecord) -> {
            try {
                JsonNode jsonNode = objectMapper.readTree(consumerRecord.value().toString());
                GenericEvent genericEvent = objectMapper.convertValue(jsonNode, new TypeReference<GenericEvent>() {});
                log.info(
                    "Retrieved the record {} from the partition {} with offset {}",
                    consumerRecord.value(),
                    consumerRecord.partition(),
                    consumerRecord.offset());
                // Process type A and B events
                if (genericEvent.getType().getName().equalsIgnoreCase("A")
                        || genericEvent.getType().getName().equalsIgnoreCase("B")) {
                    return false;
                }
                return true;
            } catch (Exception ex) {
                log.error("Error occurred: {}", ex.getStackTrace());
                return true;
            }
        });
    return factory;
}
// Listener
@KafkaListener(id = "MYREPROCESS-ID", topics = "reprocess-test",
        containerFactory = "reprocessListenerContainerFactory",
        autoStartup = "true")
public void onMessage(ConsumerRecord<String, String> consumerRecord, Acknowledgment acknowledgment) {
    JsonNode jsonNode = objectMapper.readTree(consumerRecord.value());
    GenericEvent genericEvent = objectMapper.convertValue(jsonNode, new TypeReference<GenericEvent>() {});
    // I should identify the respective payload at runtime
    Payload payload = genericEvent.getPayload();
    if (genericEvent.getType().getName().equalsIgnoreCase("A")) {
        processPayload(payload);
    } else {
        processPayload(payload);
    }
}
Something is odd. Since you are using the Spring JsonDeserializer, you have to tell it what to convert to; the properties are documented here: https://docs.spring.io/spring-kafka/docs/current/reference/html/#serdes-json-config
In that case, you would get ConsumerRecord<?, GenericEvent>.
If you want to receive a ConsumerRecord<String, String> and do the conversion yourself, you should use StringDeserializer instances instead.
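A sketch of the first option, assuming GenericEvent lives in a package named com.example.events (adjust to your actual package):
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, JsonDeserializer.class);
// Tell the deserializer what to create when the record carries no type headers
props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, "com.example.events.GenericEvent");
props.put(JsonDeserializer.TRUSTED_PACKAGES, "com.example.events");
With that in place the listener receives a ConsumerRecord<?, GenericEvent> directly, and the manual readTree/convertValue step in the filter and the listener goes away.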
I am trying to save an instance of the class below into DynamoDB but am getting:
DynamoDBMappingException: not supported; requires @DynamoDBTyped or @DynamoDBTypeConverted
@DynamoDBTable(tableName = "FulfillmentOrders")
public class FulfillmentOrder {

    @DynamoDBHashKey
    private String orderId;

    @DynamoDBAttribute
    @DynamoDBTyped(value = DynamoDBMapperFieldModel.DynamoDBAttributeType.M)
    private Map<String, Object> body;
    .......
}
It fails during the map conversion; the problem seems to be the Object generic type.
Could someone help me figure out where the problem is, or does the SDK not support this kind of conversion?
Thanks!
DynamoDB won't know how to convert the objects in the Map<String, Object>; you'll have to create a custom type converter. Once you've done this you can annotate the property with @DynamoDBTypeConverted(converter = xxx):
In your example:
@DynamoDBTable(tableName = "FulfillmentOrders")
public class FulfillmentOrder {

    @DynamoDBHashKey
    private String orderId;

    @DynamoDBAttribute
    @DynamoDBTypeConverted(converter = BodyTypeConverter.class)
    private Map<String, Object> body;
}
public static class BodyTypeConverter implements DynamoDBTypeConverter<String, Map<String, Object>> {

    @Override
    public String convert(Map<String, Object> object) {
        // Convert the map to a JSON string here
        String json = "wibble";
        return json;
    }

    @Override
    public Map<String, Object> unconvert(String s) {
        Map<String, Object> item = new HashMap<>();
        // Convert s back to a Map<String, Object> here.
        return item;
    }
}
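For example, the two placeholder bodies could be filled in with Jackson (a sketch; any JSON library would do, and error handling is kept minimal):
public static class BodyTypeConverter implements DynamoDBTypeConverter<String, Map<String, Object>> {

    private static final ObjectMapper MAPPER = new ObjectMapper();

    @Override
    public String convert(Map<String, Object> object) {
        try {
            // Serialize the map to a JSON string for storage in DynamoDB
            return MAPPER.writeValueAsString(object);
        } catch (JsonProcessingException e) {
            throw new IllegalArgumentException("Cannot serialize body", e);
        }
    }

    @Override
    public Map<String, Object> unconvert(String s) {
        try {
            // Parse the stored JSON string back into a map
            return MAPPER.readValue(s, new TypeReference<Map<String, Object>>() {});
        } catch (IOException e) {
            throw new IllegalArgumentException("Cannot deserialize body", e);
        }
    }
}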
More information can be found here
I am trying to write a map using com.amazonaws.services.dynamodb.datamodeling.DynamoDBMapper.save() and am getting this error:
Exception in thread "main"
com.amazonaws.services.dynamodb.datamodeling.DynamoDBMappingException:
Unsupported type: interface java.util.Map for public java.util.Map Config.getAttributes()
Is Map not supported by DynamoDBMapper?
Create a HashMapMarshaller
public class HashMapMarshaller extends JsonMarshaller<HashMap<String, String>> {

    @Override
    public String marshall(HashMap<String, String> obj) {
        return super.marshall(obj);
    }

    @Override
    public HashMap<String, String> unmarshall(Class<HashMap<String, String>> clazz, String json) {
        return super.unmarshall(clazz, json);
    }
}
And then assign it to your property:
@DynamoDBMarshalling(marshallerClass = HashMapMarshaller.class)
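For instance, applied to the accessor from the error message (a sketch, assuming the attributes property can be declared as a HashMap<String, String> rather than the Map interface):
@DynamoDBMarshalling(marshallerClass = HashMapMarshaller.class)
public HashMap<String, String> getAttributes() {
    return attributes;
}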
Swagger-Core seems to interpret the @Suspended final AsyncResponse asyncResponse parameter as a request body param. This is clearly not intended, nor the case.
I would like to tell swagger-core to ignore this parameter and to exclude it from the api-docs. Any ideas?
This is what my code looks like:
@Stateless
@Path("/coffee")
@Api(value = "/coffee", description = "The coffee service.")
public class CoffeeService {

    @Inject
    Event<CoffeeRequest> coffeeRequestListeners;

    @GET
    @ApiOperation(value = "Get Coffee.", notes = "Get tasty coffee.")
    @ApiResponses({
        @ApiResponse(code = 200, message = "OK"),
        @ApiResponse(code = 404, message = "Beans not found."),
        @ApiResponse(code = 500, message = "Something exceptional happened.")})
    @Produces("application/json")
    @Asynchronous
    public void makeCoffee(@Suspended final AsyncResponse asyncResponse,
                           @ApiParam(value = "The coffee type.", required = true)
                           @QueryParam("type")
                           String type) {
        coffeeRequestListeners.fire(new CoffeeRequest(type, asyncResponse));
    }
}
Update: Solution based on Answer
public class InternalSwaggerFilter implements SwaggerSpecFilter {

    @Override
    public boolean isOperationAllowed(Operation operation, ApiDescription apiDescription, Map<String, List<String>> stringListMap, Map<String, String> stringStringMap, Map<String, List<String>> stringListMap2) {
        return true;
    }

    @Override
    public boolean isParamAllowed(Parameter parameter, Operation operation, ApiDescription apiDescription, Map<String, List<String>> stringListMap, Map<String, String> stringStringMap, Map<String, List<String>> stringListMap2) {
        if (parameter.paramAccess().isDefined() && parameter.paramAccess().get().equals("internal"))
            return false;
        return true;
    }
}

FilterFactory.setFilter(new InternalSwaggerFilter());
Revised Example Code Fragment
...
@Asynchronous
public void makeCoffee(@Suspended @ApiParam(access = "internal") final AsyncResponse asyncResponse, ...)
...
Fast forward to 2016, where swagger-springmvc has been replaced by springfox (documentation is available here). Ignoring parameters is available in springfox, but is for some reason not documented.
Alternative 1: Globally ignore types or annotated types with .ignoredParameterTypes(...) in the Docket configuration:
@Bean
public Docket api() {
    return new Docket(DocumentationType.SWAGGER_2)
        .host(reverseProxyHost)
        .useDefaultResponseMessages(false)
        .directModelSubstitute(OffsetDateTime.class, String.class)
        .directModelSubstitute(Duration.class, String.class)
        .directModelSubstitute(LocalDate.class, String.class)
        .forCodeGeneration(true)
        .globalResponseMessage(RequestMethod.GET, newArrayList(
            new ResponseMessageBuilder()
                .code(200).message("Success").build()))
        .apiInfo(myApiInfo())
        .ignoredParameterTypes(AuthenticationPrincipal.class, Predicate.class, PathVariable.class)
        .select()
        .apis(withClassAnnotation(Api.class))
        .paths(any())
        .build();
}
Alternative 2: Use the @ApiIgnore annotation to ignore a single parameter in a method:
@ApiOperation(value = "User details")
@RequestMapping(value = "/api/user", method = GET, produces = APPLICATION_JSON_UTF8_VALUE)
public MyUser getUser(@ApiIgnore @AuthenticationPrincipal MyUser user) {
    ...
}
I solved this issue with the same technique that you used, but with a different approach. Instead of marking it as internal, I just ignore all the params of type AsyncResponse; that way I don't need to update the code in all methods to add the access modifier.
public class CustomSwaggerSpecFilter implements SwaggerSpecFilter {

    @Override
    public boolean isOperationAllowed(Operation operation, ApiDescription api, Map<String, List<String>> params, Map<String, String> cookies,
            Map<String, List<String>> headers) {
        return true;
    }

    @Override
    public boolean isParamAllowed(Parameter parameter, Operation operation, ApiDescription api, Map<String, List<String>> params,
            Map<String, String> cookies, Map<String, List<String>> headers) {
        if (parameter.dataType().equals("AsyncResponse")) { // ignoring AsyncResponse parameters
            return false;
        }
        return true;
    }
}
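As with the filter above, this one presumably also needs to be registered once at startup, the same way:
FilterFactory.setFilter(new CustomSwaggerSpecFilter());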
That works better for me.
I think you have to use filters.
Here is an example https://github.com/wordnik/swagger-core/issues/269
It could be coded in Java too.
Another way might be to do this.
@Bean
public SwaggerSpringMvcPlugin mvPluginOverride() {
    SwaggerSpringMvcPlugin swaggerSpringMvcPlugin = new SwaggerSpringMvcPlugin(this.springSwaggerConfig).apiInfo(apiInfo());
    swaggerSpringMvcPlugin.ignoredParameterTypes(PagedResourcesAssembler.class, Pageable.class);
    return swaggerSpringMvcPlugin;
}