How can I capture the record key and value when there is a DeserializationException while consuming a message from a Kafka topic? - spring-kafka

I'm using Spring Boot 2.1.7.RELEASE and spring-kafka 2.2.8.RELEASE. I'm using the @KafkaListener annotation to create a consumer, with all default consumer settings, and the configuration below as described in the spring-kafka documentation.
// other props
props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, ErrorHandlingDeserializer2.class);
// ErrorHandlingDeserializer2 is the 2.2.x class name (later versions renamed it to ErrorHandlingDeserializer)
props.put(ErrorHandlingDeserializer2.KEY_DESERIALIZER_CLASS, StringDeserializer.class);
props.put(ErrorHandlingDeserializer2.VALUE_DESERIALIZER_CLASS, AvroDeserializer.class.getName());
return new DefaultKafkaConsumerFactory<>(props);
Now I've implemented a custom error handler by extending SeekToCurrentErrorHandler, as suggested in the thread below, but the record value comes through as null and the record key is not in a readable format. How can I get the record key and value?
How to capture the exception and message key when using ErrorHandlingDeserializer2 to handle exceptions during deserialization
Here is my custom SeekToCurrentErrorHandler code:
@Component
public class MySeekToCurrentErrorHandler extends SeekToCurrentErrorHandler {

    private final MyDeadLetterRecoverer deadLetterRecoverer;

    @Autowired
    public MySeekToCurrentErrorHandler(MyDeadLetterRecoverer deadLetterRecoverer) {
        super(-1);
        this.deadLetterRecoverer = deadLetterRecoverer;
    }

    @Override
    public void handle(Exception thrownException, List<ConsumerRecord<?, ?>> data, Consumer<?, ?> consumer, MessageListenerContainer container) {
        if (thrownException instanceof DeserializationException) {
            // TODO: improve to support multiple records
            DeserializationException deserializationException = (DeserializationException) thrownException;
            deadLetterRecoverer.accept(data.get(0), deserializationException);
            ConsumerRecord<?, ?> consumerRecord = data.get(0);
            System.out.println(consumerRecord.key());
            System.out.println(consumerRecord.value());
        }
        else {
            // Call the super method to let SeekToCurrentErrorHandler do what it is designed for
            super.handle(thrownException, data, consumer, container);
        }
    }
}

If the key fails deserialization, the original byte[] can be obtained by calling getData() on the exception.
Similarly, if the value fails deserialization, use getData() to get the original data.
The DeadLetterPublishingRecoverer does this (since 2.3).
You can tell which of the key or value failed by calling isKey() on the exception.
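For example, inside the handle() override shown above, the raw payload can be recovered from the exception instead of from the failed ConsumerRecord. A minimal sketch (decoding with UTF-8 is an assumption for illustration; the bytes may be Avro or any other format):

if (thrownException instanceof DeserializationException) {
    DeserializationException dex = (DeserializationException) thrownException;
    byte[] raw = dex.getData(); // the original bytes that failed deserialization
    String readable = raw == null ? null : new String(raw, StandardCharsets.UTF_8);
    if (dex.isKey()) {
        System.out.println("Key failed to deserialize: " + readable);
    }
    else {
        System.out.println("Value failed to deserialize: " + readable);
    }
}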
EDIT
I was wrong; both the key and the value are available, whichever of the two failed deserialization.
This is written with Boot 2.3.4:
@SpringBootApplication
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }

    @Bean
    SeekToCurrentErrorHandler errorHandler(ProducerFactory<String, String> pf) {
        Map<String, Object> configs = new HashMap<>(pf.getConfigurationProperties());
        configs.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
        configs.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, ByteArraySerializer.class);
        ProducerFactory<byte[], byte[]> bytesPF = new DefaultKafkaProducerFactory<>(configs);
        KafkaOperations<byte[], byte[]> template = new KafkaTemplate<>(bytesPF);
        return new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template),
                new FixedBackOff(1000, 5));
    }

    @KafkaListener(id = "so64597061", topics = "so64597061",
            properties = {
                    ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG
                            + ":org.springframework.kafka.support.serializer.ErrorHandlingDeserializer",
                    ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG
                            + ":org.springframework.kafka.support.serializer.ErrorHandlingDeserializer",
                    ErrorHandlingDeserializer.KEY_DESERIALIZER_CLASS
                            + ":com.example.demo.Application$FailSometimesDeserializer",
                    ErrorHandlingDeserializer.VALUE_DESERIALIZER_CLASS
                            + ":com.example.demo.Application$FailSometimesDeserializer"
            })
    public void listen(String val, @Header(name = KafkaHeaders.RECEIVED_MESSAGE_KEY) String key) {
        System.out.println(key + ":" + val);
    }

    @KafkaListener(id = "so64597061.dlt", topics = "so64597061.DLT",
            properties = {
                    ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG
                            + ":org.apache.kafka.common.serialization.ByteArrayDeserializer",
                    ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG
                            + ":org.apache.kafka.common.serialization.ByteArrayDeserializer"
            })
    public void dltListen(byte[] val, @Header(name = KafkaHeaders.RECEIVED_MESSAGE_KEY, required = false) byte[] key) {
        String keyStr = key != null ? new String(key) : null;
        String valStr = val != null ? new String(val) : null;
        System.out.println("DLT:" + keyStr + ":" + valStr);
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
        return args -> {
            template.send("so64597061", "foo", "bar");
            template.send("so64597061", "fail", "keyFailed");
            template.send("so64597061", "valueFailed", "fail");
        };
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so64597061").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic dlt() {
        return TopicBuilder.name("so64597061.DLT").partitions(1).replicas(1).build();
    }

    public static class FailSometimesDeserializer implements Deserializer<byte[]> {

        @Override
        public void configure(Map<String, ?> configs, boolean isKey) {
        }

        @Override
        public byte[] deserialize(String topic, byte[] data) {
            return data;
        }

        @Override
        public void close() {
        }

        @Override
        public byte[] deserialize(String topic, Headers headers, byte[] data) {
            String string = new String(data);
            if ("fail".equals(string)) {
                throw new RuntimeException("fail");
            }
            return data;
        }
    }
}
spring.kafka.consumer.auto-offset-reset=earliest

Output:

foo:bar
DLT:fail:keyFailed
DLT:valueFailed:fail
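Note that both DLT lines are fully readable: the DeadLetterPublishingRecoverer publishes the raw bytes from the exception for the side that failed, together with the successfully deserialized other side.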

Related

Processing strategy of messages in a Spring Kafka listener

I just want to make sure that messages are processed correctly. When a message is received by the listener, it is always processed by a new thread (the processor bean is defined with prototype scope). Is this implementation correct? (I assumed the listener is not thread-safe, which is why the processing bean was given prototype scope.)
(Input: TestTopic - 5 partitions - 1 consumer) or (Input: TestTopic - 5 partitions - 5 consumers)
public class EventListener {

    @Autowired
    private EventProcessor eventProcessor;

    @KafkaListener(topics = "TestTopic", containerFactory = "kafkaListenerContainerFactory",
            autoStartup = "true")
    public void onMessage(
            @Payload List<ConsumerRecord<String, String>> consumerRecords, Acknowledgment acknowledgment) {
        eventProcessor.processAndAcknowledgeBatchMessages(consumerRecords, acknowledgment);
    }
}
// event processor
@Slf4j
@Component
@Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
@NoArgsConstructor
@SuppressWarnings("unused")
public class EventProcessorImpl implements EventProcessor {

    @Autowired
    private KafkaProducerTemplate kafkaProducerTemplate;

    @Autowired
    private ObjectMapper localObjectMapper;

    @Autowired
    private Dao dao;

    public void processAndAcknowledgeBatchMessages(
            List<ConsumerRecord<String, String>> consumerRecords, Acknowledgment acknowledgment) {
        long start = System.currentTimeMillis();
        consumerRecords.forEach(consumerRecord -> {
            try {
                Event event = localObjectMapper.readValue(consumerRecord.value(), Event.class);
                dao.save(process(event));
            }
            catch (JsonProcessingException e) {
                // readValue throws a checked exception, so it must be handled inside the lambda
                throw new IllegalStateException(e);
            }
        });
        acknowledgment.acknowledge();
    }
}
No, it is not correct; you should not hand processing off to another thread; doing so will cause problems with offset commits and error handling.
Also, making EventProcessorImpl a prototype bean won't help. That just means a new instance is created each time the bean is referenced.
Since it is @Autowired, it is referenced only once, during initialization. To get a new instance for each request, you would need to call getBean() on the application context each time.
It is better to make your code thread-safe.
EDIT
There are (at least) a couple of ways to deal with a non-thread-safe service defined in prototype scope.
Use a ThreadLocal:
@SpringBootApplication
public class So68447863Application {

    public static void main(String[] args) {
        SpringApplication.run(So68447863Application.class, args);
    }

    private static final ThreadLocal<NotThreadSafeService> SERVICES = new ThreadLocal<>();

    @Autowired
    ApplicationContext context;

    @KafkaListener(id = "so68447863", topics = "so68447863", concurrency = "5")
    void listen(String in) {
        NotThreadSafeService service = SERVICES.get();
        if (service == null) {
            service = this.context.getBean(NotThreadSafeService.class);
            SERVICES.set(service);
        }
        service.process(in);
    }

    @EventListener
    void removeService(ConsumerStoppedEvent event) {
        System.out.println("Consumer stopped; removing TL");
        SERVICES.remove();
    }

    @Bean
    NewTopic topic() {
        return TopicBuilder.name("so68447863").partitions(10).replicas(1).build();
    }

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    NotThreadSafeService service() {
        return new NotThreadSafeService();
    }
}

class NotThreadSafeService {

    void process(String msg) {
        System.out.println(msg + " processed by " + this);
    }
}
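Note the @EventListener on ConsumerStoppedEvent: it clears the ThreadLocal when a consumer stops, so the cached prototype instance does not leak after the container's thread goes away.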
Use a pool of instances.
@SpringBootApplication
public class So68447863Application {

    public static void main(String[] args) {
        SpringApplication.run(So68447863Application.class, args);
    }

    private static final BlockingQueue<NotThreadSafeService> SERVICES = new LinkedBlockingQueue<>();

    @Autowired
    ApplicationContext context;

    @KafkaListener(id = "so68447863", topics = "so68447863", concurrency = "5")
    void listen(String in) {
        NotThreadSafeService service = SERVICES.poll();
        if (service == null) {
            service = this.context.getBean(NotThreadSafeService.class);
        }
        try {
            service.process(in);
        }
        finally {
            SERVICES.add(service);
        }
    }

    @Bean
    NewTopic topic() {
        return TopicBuilder.name("so68447863").partitions(10).replicas(1).build();
    }

    @Bean
    @Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)
    NotThreadSafeService service() {
        return new NotThreadSafeService();
    }
}

class NotThreadSafeService {

    void process(String msg) {
        System.out.println(msg + " processed by " + this);
    }
}
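Between the two approaches: the ThreadLocal version pins one service instance to each consumer thread for the life of the container, while the pool shares instances across threads and only grows when every pooled instance is busy. Which is preferable depends mainly on how expensive the service is to create and hold.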

Issue with RecordFilterStrategy (Spring Boot 2.3.8): filtered messages keep coming back to the filter

I am working with the spring-kafka batch listener record filter strategy and am facing an issue where the filtered events come back to the filter again and again. Could anyone help me with this? Spring Boot with Kafka, version 2.3.8.
Here is my configuration:
ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
        new ConcurrentKafkaListenerContainerFactory<>();
configurer.configure(factory, kafkaConsumerFactory);
factory.setBatchListener(true);
factory.setAckDiscarded(true);
factory.getContainerProperties().setIdleBetweenPolls(30000);
factory.setRecordFilterStrategy(consumerRecord -> {
    try {
        // the record value is assumed to be a JSON String here
        MyObject myObject = new ObjectMapper().readValue((String) consumerRecord.value(), MyObject.class);
        // return true to filter (discard) the record
        return myObject.frequency <= 10;
    }
    catch (JsonProcessingException e) {
        throw new IllegalStateException(e);
    }
});
factory.setBatchErrorHandler(new SeekToCurrentBatchErrorHandler());
When using batch mode with MANUAL acks, if you filter all the records (discard them all), the listener will get an empty list so you can still acknowledge the batch to commit the offsets.
I just tested it and it works as expected.
@SpringBootApplication
public class So67259790Application {

    public static void main(String[] args) {
        SpringApplication.run(So67259790Application.class, args);
    }

    @KafkaListener(id = "so67259790", topics = "so67259790")
    public void listen(List<String> in, Acknowledgment ack) {
        System.out.println(in);
        ack.acknowledge();
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so67259790").partitions(1).replicas(1).build();
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, String> template) {
        return args -> {
            template.send("so67259790", "foo");
            template.send("so67259790", "bar");
        };
    }

    @Bean
    public RecordFilterStrategy<Object, Object> rfs() {
        return rec -> true;
    }
}
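If the discarded records still keep coming back in your setup, check that the offsets of filtered batches are actually committed: with manual acks the listener must acknowledge even an empty (fully filtered) batch, otherwise the same records are re-fetched after the next seek, rebalance, or restart.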

Query with a DynamoDB secondary index (AWS SDK 2 for Java): exception creating a DynamoDbIndex object

I'm having trouble running a query against a secondary index, getting an exception:
Ex getting dynamodb scan: java.lang.IllegalArgumentException: Attempt to execute an operation that requires a secondary index without defining the index attributes in the table metadata. Index name: category-timestamp-index
Can someone guide me on what I'm doing wrong?
My table is idIT_RSS_Sources and I've created an index category-timestamp-index.
(screenshot of the index omitted)
My code is:
DynamoDbEnhancedClient enhancedClient = getEnhancedDBClient(region);

// Create a DynamoDbIndex object
logger.debug("getting RSS Source category-timestamp-index");

// this throws the exception
DynamoDbIndex<RSS_Source> catIndex =
        enhancedClient.table("idIT_RSS_Sources",
                TableSchema.fromBean(RSS_Source.class))
        .index("category-timestamp-index");

logger.debug("building query attributes");
AttributeValue att = AttributeValue.builder()
        .s(theCategory)
        .build();

Map<String, AttributeValue> expressionValues = new HashMap<>();
expressionValues.put(":value", att);

Expression expression = Expression.builder()
        .expression("category = :value")
        .expressionValues(expressionValues)
        .build();

// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
        .keyEqualTo(Key.builder().partitionValue(theCategory)
                .build());

logger.debug("calling catIndex.query in getRSS...ForCategory");
// DynamoDbIndex.query returns an SdkIterable<Page<T>>, not an Iterator
SdkIterable<Page<RSS_Source>> dbFeedResults = catIndex.query(
        QueryEnhancedRequest.builder()
                .queryConditional(queryConditional)
                .build());
Solved: I was not using the proper annotation in my model class:

@DynamoDbSecondaryPartitionKey(indexNames = { "category-index" })
public String getCategory() { return category; }
public void setCategory(String category) { this.category = category; }
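This matches the exception text: with the enhanced client, any index you reference via .index(...) must be declared in the bean's table metadata with @DynamoDbSecondaryPartitionKey (and @DynamoDbSecondarySortKey if the index has a sort key), as the answers below illustrate.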
Assume you have a model named Issues.
package com.example.dynamodb;

import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSecondaryPartitionKey;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbSortKey;

@DynamoDbBean
public class Issues {

    private String issueId;
    private String title;
    private String createDate;
    private String description;
    private String dueDate;
    private String status;
    private String priority;
    private String lastUpdateDate;

    @DynamoDbPartitionKey
    public String getId() {
        return this.issueId;
    }

    public void setId(String id) {
        this.issueId = id;
    }

    @DynamoDbSortKey
    public String getTitle() {
        return this.title;
    }

    public void setTitle(String title) {
        this.title = title;
    }

    public void setLastUpdateDate(String lastUpdateDate) {
        this.lastUpdateDate = lastUpdateDate;
    }

    public String getLastUpdateDate() {
        return this.lastUpdateDate;
    }

    public void setPriority(String priority) {
        this.priority = priority;
    }

    public String getPriority() {
        return this.priority;
    }

    public void setStatus(String status) {
        this.status = status;
    }

    public String getStatus() {
        return this.status;
    }

    public void setDueDate(String dueDate) {
        this.dueDate = dueDate;
    }

    @DynamoDbSecondaryPartitionKey(indexNames = { "dueDateIndex" })
    public String getDueDate() {
        return this.dueDate;
    }

    public String getDate() {
        return this.createDate;
    }

    public void setDate(String date) {
        this.createDate = date;
    }

    public String getDescription() {
        return this.description;
    }

    public void setDescription(String description) {
        this.description = description;
    }
}
Notice the annotation on getDueDate.
@DynamoDbSecondaryPartitionKey(indexNames = { "dueDateIndex" })
public String getDueDate() {
    return this.dueDate;
}
This is because the Issues table has a secondary index named dueDateIndex.
To query on this secondary index, you can use this code that uses the Amazon DynamoDB Java API V2:
public static void queryIndex(DynamoDbClient ddb, String tableName, String indexName) {
    try {
        // Create a DynamoDbEnhancedClient that wraps the DynamoDbClient
        DynamoDbEnhancedClient enhancedClient = DynamoDbEnhancedClient.builder()
                .dynamoDbClient(ddb)
                .build();

        // Create a DynamoDbIndex object based on Issues
        DynamoDbIndex<Issues> secIndex =
                enhancedClient.table("Issues",
                        TableSchema.fromBean(Issues.class))
                .index("dueDateIndex");

        String dateVal = "2013-11-19";
        AttributeValue attVal = AttributeValue.builder()
                .s(dateVal)
                .build();

        // Create a QueryConditional object that's used in the query operation
        QueryConditional queryConditional = QueryConditional
                .keyEqualTo(Key.builder().partitionValue(attVal)
                        .build());

        // Query the index and print every item on every page
        SdkIterable<Page<Issues>> results = secIndex.query(
                QueryEnhancedRequest.builder()
                        .queryConditional(queryConditional)
                        .build());

        results.forEach(page -> page.items().forEach(issue ->
                System.out.println("The issue title is " + issue.getTitle())));

    } catch (DynamoDbException e) {
        System.err.println(e.getMessage());
        System.exit(1);
    }
}
For what it's worth, if your Global Secondary Index has a sort key, you must annotate that field in the DynamoDB bean with:
@DynamoDbSecondarySortKey(indexNames = { "<indexName>" })
public String getFieldName() {
    return fieldName;
}
My working code is below, where sortKey-index is the name of the GSI in DynamoDB:
List<Flow> flows = new ArrayList<>();
DynamoDbIndex<Flow> flowBySortKey = table().index("sortKey-index");

// Create a QueryConditional object that's used in the query operation
QueryConditional queryConditional = QueryConditional
        .keyEqualTo(Key.builder()
                .partitionValue(sortKey)
                .build());

SdkIterable<Page<Flow>> dbFeedResults = flowBySortKey.query(
        QueryEnhancedRequest.builder()
                .queryConditional(queryConditional)
                .build());

dbFeedResults.forEach(flowPage -> flows.addAll(flowPage.items()));

SeekToCurrentErrorHandler: DeadLetterPublishingRecoverer is not handling deserialization errors

I am trying to write a Kafka consumer using the spring-kafka 2.3.0.M2 library.
To handle runtime errors I am using SeekToCurrentErrorHandler with a DeadLetterPublishingRecoverer as my recoverer. This works fine when my consumer code throws an exception, but fails when the message cannot be deserialized.
I tried implementing the ErrorHandler myself and I was successful, but with that approach I end up writing the dead-letter publishing code myself, which I do not want to do.
Below are my Kafka properties:
spring:
  kafka:
    consumer:
      bootstrap-servers: localhost:9092
      group-id: group_id
      auto-offset-reset: latest
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      properties:
        spring.json.trusted.packages: com.mypackage
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
public ConcurrentKafkaListenerContainerFactory kafkaListenerContainerFactory(
        ConcurrentKafkaListenerContainerFactoryConfigurer configurer,
        ConsumerFactory<Object, Object> kafkaConsumerFactory,
        KafkaTemplate<Object, Object> template) {
    ConcurrentKafkaListenerContainerFactory<Object, Object> factory = new ConcurrentKafkaListenerContainerFactory<>();
    configurer.configure(factory, kafkaConsumerFactory);
    factory.setErrorHandler(new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template), maxFailures));
    return factory;
}
It works fine for me (note that Boot will auto-configure the error handler)...
@SpringBootApplication
public class So56728833Application {

    public static void main(String[] args) {
        SpringApplication.run(So56728833Application.class, args);
    }

    @Bean
    public SeekToCurrentErrorHandler errorHandler(KafkaTemplate<String, String> template) {
        SeekToCurrentErrorHandler eh = new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template), 3);
        eh.setClassifier( // retry for all except deserialization exceptions
                new BinaryExceptionClassifier(Collections.singletonList(DeserializationException.class), false));
        return eh;
    }

    @KafkaListener(id = "so56728833", topics = "so56728833")
    public void listen(Foo in) {
        System.out.println(in);
        if (in.getBar().equals("baz")) {
            throw new IllegalStateException("Test retries");
        }
    }

    @KafkaListener(id = "so56728833dlt", topics = "so56728833.DLT")
    public void listenDLT(Object in) {
        System.out.println("Received from DLT: " + (in instanceof byte[] ? new String((byte[]) in) : in));
    }

    @Bean
    public NewTopic topic() {
        return TopicBuilder.name("so56728833").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic dlt() {
        return TopicBuilder.name("so56728833.DLT").partitions(1).replicas(1).build();
    }

    public static class Foo {

        private String bar;

        public Foo() {
            super();
        }

        public Foo(String bar) {
            this.bar = bar;
        }

        public String getBar() {
            return this.bar;
        }

        public void setBar(String bar) {
            this.bar = bar;
        }

        @Override
        public String toString() {
            return "Foo [bar=" + this.bar + "]";
        }
    }
}
spring:
  kafka:
    consumer:
      auto-offset-reset: earliest
      enable-auto-commit: false
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer2
      properties:
        spring.json.trusted.packages: com.example
        spring.deserializer.key.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
        spring.json.value.default.type: com.example.So56728833Application$Foo
    producer:
      key-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer

logging:
  level:
    org.springframework.kafka: trace
I have 3 records in the topic:
"badJSON"
"{\"bar\":\"baz\"}"
"{\"bar\":\"qux\"}"
I see the first one going directly to the DLT, and the second one goes there after 3 attempts.
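The first record skips the retries because of the classifier configured on the error handler: BinaryExceptionClassifier classifies DeserializationException as non-retryable, so that failure goes straight to the recoverer, while the listener exception thrown for "baz" is retried 3 times before being published.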

Is there a way to make Spring Thymeleaf process a string template?

I would like to write something like:
@Autowired
private SpringTemplateEngine engine;
....
// Thymeleaf Context
WebContext thymeleafContext = new WebContext(request, response, request.getServletContext(), locale);
// cached html of a thymeleaf template file
String cachedHtml = ....
// process the cached html
String html = engine.process(cachedHtml, thymeleafContext);
By default, the process method can't do that. I understand from the docs that I need a special template resolver:
In order to execute templates, the process(String, IContext) method will be used:
final String result = templateEngine.process("mytemplate", ctx);
The "mytemplate" String argument is the template name, and it will relate to the physical/logical location of the template itself in a way configured at the template resolver/s.
Does anyone know how to solve this? The goal is to cache the Thymeleaf templates (files) in strings and then process these strings rather than the files.
The solution we ended up using consisted of a new IResourceResolver with a custom Context rather than a custom TemplateResolver. We chose this because we still wanted to use classpath scanning in most cases, but occasionally had dynamic content.
The following shows how we did it:
public class StringAndClassLoaderResourceResolver implements IResourceResolver {

    public StringAndClassLoaderResourceResolver() {
        super();
    }

    public String getName() {
        return getClass().getName().toUpperCase();
    }

    public InputStream getResourceAsStream(final TemplateProcessingParameters params, final String resourceName) {
        Validate.notNull(resourceName, "Resource name cannot be null");
        if (StringContext.class.isAssignableFrom(params.getContext().getClass())) {
            String content = ((StringContext) params.getContext()).getContent();
            return IOUtils.toInputStream(content);
        }
        return ClassLoaderUtils.getClassLoader(ClassLoaderResourceResolver.class).getResourceAsStream(resourceName);
    }

    public static class StringContext extends Context {

        private final String content;

        public StringContext(String content) {
            this.content = content;
        }

        public StringContext(String content, Locale locale) {
            super(locale);
            this.content = content;
        }

        public StringContext(String content, Locale locale, Map<String, ?> variables) {
            super(locale, variables);
            this.content = content;
        }

        public String getContent() {
            return content;
        }
    }
}
Test Case
public class StringAndClassLoaderResourceResolverTest {

    private static SpringTemplateEngine templateEngine;

    @BeforeClass
    public static void setup() {
        TemplateResolver resolver = new TemplateResolver();
        resolver.setResourceResolver(new StringAndClassLoaderResourceResolver());
        resolver.setPrefix("mail/"); // src/test/resources/mail
        resolver.setSuffix(".html");
        resolver.setTemplateMode("LEGACYHTML5");
        resolver.setCharacterEncoding(CharEncoding.UTF_8);
        resolver.setOrder(1);

        templateEngine = new SpringTemplateEngine();
        templateEngine.setTemplateResolver(resolver);
    }

    @Test
    public void testStringResolution() {
        String expected = "<div>dave</div>";
        String input = "<div th:text=\"${userName}\">Some Username Here!</div>";

        IContext context = new StringAndClassLoaderResourceResolver.StringContext(input);
        context.getVariables().put("userName", "dave");

        String actual = templateEngine.process("redundant", context);
        assertEquals(expected, actual);
    }

    @Test
    public void testClasspathResolution() {
        IContext context = new Context();
        context.getVariables().put("message", "Hello Thymeleaf!");

        String actual = templateEngine.process("dummy", context);
        String expected = "<h1>Hello Thymeleaf!</h1>";
        assertEquals(expected, actual);
    }
}
Dummy template file at src/main/resources/mail/dummy.html
<h1 th:text="${message}">A message will go here!</h1>
Note: We used Apache CommonsIO's IOUtils for converting the String to an InputStream
You can implement your own TemplateResolver and IResourceResolver to work with Strings. For simple unit tests:
static class TestResourceResolver implements IResourceResolver {

    public String content = "";

    @Override
    public String getName() {
        return "TestTemplateResolver";
    }

    @Override
    public InputStream getResourceAsStream(TemplateProcessingParameters templateProcessingParameters,
            String resourceName) {
        return new ByteArrayInputStream(content.getBytes());
    }
}
or just use org.thymeleaf.templateresolver.StringTemplateResolver in Thymeleaf 3
Yep, StringTemplateResolver is the way to go.
public class ReportTemplateEngine {

    private static TemplateEngine instance;

    private ReportTemplateEngine() {}

    public static TemplateEngine getInstance() {
        if (instance == null) {
            synchronized (ReportTemplateEngine.class) {
                if (instance == null) {
                    instance = new TemplateEngine();
                    StringTemplateResolver templateResolver = new StringTemplateResolver();
                    templateResolver.setTemplateMode(TemplateMode.HTML);
                    instance.setTemplateResolver(templateResolver);
                }
            }
        }
        return instance;
    }
}
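A hypothetical usage of the singleton above (Thymeleaf 3): with StringTemplateResolver installed, the string passed to process() is treated as the template content itself, not as a file name.

TemplateEngine engine = ReportTemplateEngine.getInstance();
Context context = new Context();
context.setVariable("userName", "dave");
// The first argument is processed directly as template markup
String html = engine.process("<div th:text=\"${userName}\">placeholder</div>", context);
// html is "<div>dave</div>"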
