Micronaut multiple JDBC templates not working - jdbctemplate

I am trying to write a Micronaut function that is deployed as an AWS Lambda.
The function needs to connect to multiple databases, fetch data, and put the details into AWS SQS. To read from the different data sources I am using the JdbcTemplate approach, but I am getting this error:

    Multiple possible bean candidates found: [org.springframework.jdbc.core.JdbcTemplate, org.springframework.jdbc.core.JdbcTemplate, org.springframework.jdbc.core.JdbcTemplate]
    package io.test.invoice;

    import io.micronaut.context.annotation.Factory;
    import org.springframework.jdbc.core.JdbcTemplate;

    import javax.inject.Named;
    import javax.inject.Singleton;
    import javax.sql.DataSource;

    @Factory
    public class JdbcTemplateFactory {

        @Singleton
        JdbcTemplate jdbcTemplateOne(DataSource dataSource) {
            return new JdbcTemplate(dataSource);
        }

        @Singleton
        JdbcTemplate jdbcTemplateTwo(@Named(value = "database2") DataSource dataSource) {
            return new JdbcTemplate(dataSource);
        }
    }
    package io.test.invoice;

    import io.micronaut.context.annotation.Requires;
    import org.springframework.jdbc.core.BeanPropertyRowMapper;
    import org.springframework.jdbc.core.JdbcTemplate;

    import javax.inject.Singleton;
    import java.util.List;

    @Singleton
    @Requires(beans = JdbcTemplate.class)
    public class CodeSetRepository {

        private final JdbcTemplate jdbcTemplateOne;
        private final JdbcTemplate jdbcTemplateTwo;

        public CodeSetRepository(JdbcTemplate jdbcTemplateOne, JdbcTemplate jdbcTemplateTwo) {
            this.jdbcTemplateOne = jdbcTemplateOne;
            this.jdbcTemplateTwo = jdbcTemplateTwo;
        }

        public List<CodeSet> getAllCodeSets() {
            String SELECT_QUERY = "SELECT * FROM public.code_set";
            return this.jdbcTemplateTwo.query(SELECT_QUERY, new BeanPropertyRowMapper<>(CodeSet.class));
        }

        public List<Country> getAllCountries() {
            String SELECT_QUERY = "SELECT * FROM public.country";
            return this.jdbcTemplateOne.query(SELECT_QUERY, new BeanPropertyRowMapper<>(Country.class));
        }
    }
Could anyone help with this please?

The name of the parameter jdbcTemplateOne has no bearing on the injection, so both constructor parameters are asking for the same bean type. Because there are multiple JdbcTemplate beans, Micronaut doesn't know which one to inject.
In your factory you can create a template for each data source with:

    @EachBean(DataSource.class)
    JdbcTemplate jdbcTemplateOne(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
Then the named qualifier of the data source will transfer to the template. That means in your example you could inject @Named("database2") JdbcTemplate jdbcTemplate.
Alternatively, you can add @Named qualifiers to the factory methods and then inject the JDBC templates with those qualifiers.
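As a sketch of that alternative, the factory could look something like this (the qualifier name "database1" is an assumption; only "database2" appears in the question):

```java
import io.micronaut.context.annotation.Factory;
import org.springframework.jdbc.core.JdbcTemplate;

import javax.inject.Named;
import javax.inject.Singleton;
import javax.sql.DataSource;

@Factory
public class JdbcTemplateFactory {

    // Qualify each template so injection points can select one unambiguously.
    @Singleton
    @Named("database1")
    JdbcTemplate jdbcTemplateOne(DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }

    @Singleton
    @Named("database2")
    JdbcTemplate jdbcTemplateTwo(@Named("database2") DataSource dataSource) {
        return new JdbcTemplate(dataSource);
    }
}
```

An injection point then requests a specific template by qualifier, e.g. `@Named("database1") JdbcTemplate jdbcTemplateOne`.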

Change your repository constructor like below:

    @Inject
    public CodeSetRepository(@Named("database2") JdbcTemplate jdbcTemplateOne) {
        this.jdbcTemplateOne = jdbcTemplateOne;
    }

Related

Why do I get an "Ambiguous dependencies for interface" exception when I already use the @Produces annotation?

I'm using two message-oriented middlewares in my project: RabbitMQ and Apache Kafka. I have a consumer interface IConsume, which is implemented by ConsumerRabbitMQ and ConsumerKafka. At startup, based on some conditions, I use the @Produces annotation to choose an implementation for the interface bean that I will inject, but it gives me this error.
Exception 1:

    org.jboss.weld.exceptions.DeploymentException: WELD-001409: Ambiguous dependencies for type IConsume with qualifiers @Default
      at injection point [BackedAnnotatedField] @Inject private com.mycompany.chatapp.startup.RunConsumers.ct
      at com.mycompany.chatapp.startup.RunConsumers.ct(RunConsumers.java:0)
    Possible dependencies:
    - Session bean [class com.mycompany.chatapp.messagegateway.ConsumerRabbitMQ with qualifiers [@Any @Default]; local interfaces are [IConsume],
    - Producer Method [IConsume] with qualifiers [@Any @Default] declared as [[BackedAnnotatedMethod] @Produces public com.mycompany.chatapp.startup.MOMConfigBean.produceIConsume()],
    - Session bean [class com.mycompany.chatapp.messagegateway.ConsumerKafka with qualifiers [@Any @Default]; local interfaces are [IConsume]
@Default and @Alternative work, but I want it to choose by checking which of the middlewares is running.
The lookup works; I also tried beanName. I think the problem is with the @Produces, but I can't seem to find what.
    import javax.enterprise.inject.Produces;

    @Singleton
    @Startup
    public class MOMConfigBean {

        private String mom;

        @PostConstruct
        public void init() {
            mom = "Kafka";
        }

        @EJB(lookup = "java:global/Chatapp/ConsumerKafka!com.mycompany.chatapp.messagegateway.IConsume")
        IConsume kafkaConsumer;

        @EJB(lookup = "java:global/Chatapp/ConsumerRabbitMQ!com.mycompany.chatapp.messagegateway.IConsume")
        IConsume rabbitConsumer;

        @Produces
        public IConsume produceIConsume() {
            if ("Kafka".equals(mom)) {
                return kafkaConsumer;
            } else {
                return rabbitConsumer;
            }
        }
    }

    public interface IConsume {
        // some code
    }

    @Stateless
    public class ConsumerKafka implements IConsume {
        // some code
    }

    @Stateless
    public class ConsumerRabbitMQ implements IConsume {
        // some code
    }

    public class runConsumers {
        @Inject
        private IConsume ct;
    }
You have three ambiguous sources of IConsume instances:
- a ConsumerKafka EJB
- a ConsumerRabbitMQ EJB
- a @Produces public IConsume produceIConsume() method
You need to disambiguate the source of the IConsume instances using a qualifier.
This qualifier would look something like:
    import static java.lang.annotation.ElementType.FIELD;
    import static java.lang.annotation.ElementType.METHOD;
    import static java.lang.annotation.RetentionPolicy.RUNTIME;

    import java.lang.annotation.Retention;
    import java.lang.annotation.Target;
    import javax.inject.Qualifier;

    @Qualifier
    @Retention(RUNTIME)
    @Target({METHOD, FIELD})
    public @interface ConditionalMom {
    }
Then qualify the producer:
    @Produces
    @ConditionalMom
    public IConsume produceIConsume() {
        if ("Kafka".equals(mom)) {
            return kafkaConsumer;
        } else {
            return rabbitConsumer;
        }
    }
and the injection site:
    public class runConsumers {
        @Inject
        @ConditionalMom
        private IConsume ct;
    }
Now you have a single source of @ConditionalMom IConsume instances, so it is no longer ambiguous.
You will find that you will be using qualifiers all over the place as you start to further exploit CDI features.

Map<String,String> variable is not getting added in DynamoDB Local but works well in real DynamoDB

I am trying to add a Map variable using the Spring plugin for DynamoDB into a DynamoDBLocal database. It doesn't give any error when I write, but when I read the same variable back I get a null value for the map.
This datatype is managed by DynamoDBMapper by default, I believe. Do you have any idea what's wrong or what needs to be done?
Here's my domain code:
    package com.wt.domain;

    import java.io.Serializable;
    import java.util.HashMap;
    import java.util.Map;

    import org.springframework.data.annotation.Id;

    import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
    import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
    import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBIgnore;
    import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBRangeKey;
    import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

    import lombok.AllArgsConstructor;
    import lombok.Getter;
    import lombok.NoArgsConstructor;
    import lombok.Setter;

    @DynamoDBTable(tableName = "Notification")
    @AllArgsConstructor
    @NoArgsConstructor
    public class Notification implements Serializable {

        private static final long serialVersionUID = 1L;

        @Id
        @DynamoDBIgnore
        private NotificationCompositeKey notificationCompositeKey = new NotificationCompositeKey();

        @Getter
        @Setter
        @DynamoDBAttribute // This is the Map - giving null when read
        private Map<String, String> jsonMessage = new HashMap<String, String>();

        @DynamoDBHashKey(attributeName = "identityId")
        public String getIdentityId() {
            return notificationCompositeKey.getIdentityId();
        }

        public void setIdentityId(String identityId) {
            notificationCompositeKey.setIdentityId(identityId);
        }

        public void setTimestamp(long timestamp) {
            notificationCompositeKey.setTimestamp(timestamp);
        }

        @DynamoDBRangeKey(attributeName = "notificationTimestamp")
        public long getTimestamp() {
            return notificationCompositeKey.getTimestamp();
        }
    }
Firstly, I presume that you have tested the code against real DynamoDB in some deployed environment rather than locally.
I think the problem is not with local DynamoDB; it is the way you generate the getter and setter with Lombok. If you remove the Lombok getter/setter annotations and add the getter/setter manually, it should work on local DynamoDB as well. I have seen this behavior.
    @DynamoDBAttribute(attributeName = "jsonMessage")
    public Map<String, String> getJsonMessage() {
        return jsonMessage;
    }

    public void setJsonMessage(Map<String, String> jsonMessage) {
        this.jsonMessage = jsonMessage;
    }
Thanks guys, the issue got resolved. The problem was that I had a condition that inserted a blank string for a key-value pair in the map when the value was null.
As per the AWS documentation, DynamoDB won't allow a Map or List attribute to be written if a value inside it is null or blank.
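A framework-free way to avoid that failure mode is to strip null and blank values from the map before handing it to the mapper. A minimal sketch (the class and method names here are illustrative):

```java
import java.util.HashMap;
import java.util.Map;

public class MapSanitizer {

    // Returns a copy of the input with null and blank values removed,
    // since DynamoDB rejects such values inside Map attributes.
    public static Map<String, String> withoutBlankValues(Map<String, String> in) {
        Map<String, String> out = new HashMap<>();
        for (Map.Entry<String, String> e : in.entrySet()) {
            if (e.getValue() != null && !e.getValue().trim().isEmpty()) {
                out.put(e.getKey(), e.getValue());
            }
        }
        return out;
    }
}
```

Calling `withoutBlankValues(jsonMessage)` before the save keeps only the entries DynamoDB will accept.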

Error creating a base repository class

    import org.springframework.data.jpa.repository.support.JpaEntityInformation;
    import org.springframework.data.jpa.repository.support.QueryDslJpaRepository;
    import org.springframework.data.querydsl.EntityPathResolver;
    import org.springframework.data.repository.NoRepositoryBean;
    import org.springframework.transaction.annotation.Transactional;

    import javax.persistence.EntityManager;

    @NoRepositoryBean
    public class RepositorySupportImpl<T> extends QueryDslJpaRepository<T, Integer> implements RepositorySupport<T> {

        private EntityManager entityManager;

        public RepositorySupportImpl(JpaEntityInformation<T, Integer> entityInformation, EntityManager entityManager, EntityManager entityManager1) {
            super(entityInformation, entityManager);
            this.entityManager = entityManager1;
        }

        public RepositorySupportImpl(JpaEntityInformation<T, Integer> entityInformation, EntityManager entityManager, EntityPathResolver resolver, EntityManager entityManager1) {
            super(entityInformation, entityManager, resolver);
            this.entityManager = entityManager1;
        }

        @Override
        public EntityManager getEntityManager() {
            return this.entityManager;
        }

        @Transactional
        @Override
        public <S extends T> S save(final S entity) {
            this.getEntityManager().persist(entity);
            return entity;
        }
    }

    import org.springframework.data.jpa.repository.JpaRepository;
    import org.springframework.data.querydsl.QueryDslPredicateExecutor;

    import javax.persistence.EntityManager;

    public interface RepositorySupport<T> extends JpaRepository<T, Integer>, QueryDslPredicateExecutor<T> {
        EntityManager getEntityManager();
    }
In my config class I have repositoryBaseClass = RepositorySupportImpl.class, but I get this error:

    org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'registerService': Unsatisfied dependency expressed through field 'userRepository'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userRepository': Invocation of init method failed; nested exception is java.lang.IllegalStateException: No suitable constructor found on class ca.lighthouse.repository.RepositorySupportImpl to match the given arguments: [class org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation, class com.sun.proxy.$Proxy52]. Make sure you implement a constructor taking these
I had the same problem just now and after some debugging I figured out the solution:
Make sure you only have one @EnableJpaRepositories annotation and that it's pointing to the implementation class (not to the interface):

    @EnableJpaRepositories(repositoryBaseClass = GenericJpaRepositoryImpl.class)
I hope it helps someone in the future ;)
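For reference, a minimal configuration class along those lines might look like the following (the config class name and base package are illustrative, not from the question):

```java
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@Configuration
@EnableJpaRepositories(
        basePackages = "ca.lighthouse.repository",
        // Point at the implementation class, not the RepositorySupport interface.
        repositoryBaseClass = RepositorySupportImpl.class)
public class JpaConfig {
}
```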
I already had the same problem, and the solution was to create the right constructor for the repository implementation:
    public RepositorySupportImpl(JpaEntityInformation<T, ?> entityInformation, EntityManager entityManager) {
        super(entityInformation, entityManager);
    }
You have to use both constructors in the Impl class:

    public GenericRepositoryImpl(JpaEntityInformation<T, ?> entityInformation, EntityManager entityManager) {
        super(entityInformation, entityManager);
        this.em = entityManager;
        domainClass = null;
    }

    public GenericRepositoryImpl(Class<T> domainClass, EntityManager entityManager) {
        super(domainClass, entityManager);
        this.em = entityManager;
        this.domainClass = domainClass;
    }
Thanks for all your comments; it's working now. I wish I could definitively pinpoint the problem, but I can't. Here is the scenario: I have a Spring MVC application with controllers, entities, business services, and JSPs, and I wanted to clean things up, so I decided to break the project down into modules. I created a new project, added the modules, and copied the files from the old project into the new one. When I copied the RepositorySupport class and interface, I thought I should rename them to what you see above. That resulted in this error, and after a few days of researching and trying different things, I decided to copy in the original files, and it worked. That's all I did: copy in the files and update the references.

Spring Boot Hibernate RestFull Service PostgreSQL

I'm a new user of Spring and I want to develop a RESTful service with Hibernate, PostgreSQL, and Spring Boot. I'm trying to learn from the Spring documentation, but I have a lot of problems deploying even a simple service.
I don't use an XML properties file but a Java configuration class.
Here are my different files:
PersistenceJPAConfig.java:
    package com.spring.configuration;

    import javax.sql.DataSource;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.jdbc.datasource.DriverManagerDataSource;
    import org.springframework.orm.jpa.JpaTransactionManager;
    import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
    import org.springframework.orm.jpa.vendor.Database;
    import org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter;

    @Configuration
    public class PersistenceJPAConfig {

        @Bean
        public DataSource dataSource() {
            DriverManagerDataSource driver = new DriverManagerDataSource();
            driver.setDriverClassName("org.postgresql.Driver");
            driver.setUrl("jdbc:postgresql://localhost:5432/test");
            driver.setUsername("test");
            driver.setPassword("test");
            return driver;
        }

        @Bean
        public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
            HibernateJpaVendorAdapter vendorAdapter = new HibernateJpaVendorAdapter();
            vendorAdapter.setDatabase(Database.POSTGRESQL);
            vendorAdapter.setGenerateDdl(true);
            LocalContainerEntityManagerFactoryBean factory = new LocalContainerEntityManagerFactoryBean();
            factory.setJpaVendorAdapter(vendorAdapter);
            factory.setPackagesToScan(getClass().getPackage().getName());
            factory.setDataSource(dataSource());
            return factory;
        }

        @Bean
        @Autowired
        public JpaTransactionManager transactionManager() {
            JpaTransactionManager txManager = new JpaTransactionManager();
            txManager.setEntityManagerFactory(entityManagerFactory().getObject());
            return txManager;
        }
    }
I have a classic model and here is the Repository :
    package com.spring.persistence.repositories;

    import com.spring.persistence.model.ApplicationUser;

    import org.springframework.beans.factory.annotation.Qualifier;
    import org.springframework.data.jpa.repository.JpaRepository;
    import org.springframework.stereotype.Repository;

    @Repository
    @Qualifier(value = "applicationUserRepository")
    public interface ApplicationUserRepository extends JpaRepository<ApplicationUser, Long> {
    }
A simple service:
    package com.spring.persistence.service;

    import com.spring.persistence.model.ApplicationUser;
    import com.spring.persistence.repositories.ApplicationUserRepository;

    import java.util.List;

    import javax.annotation.PostConstruct;
    import javax.transaction.Transactional;

    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.stereotype.Service;

    @Service
    @Transactional
    public class ApplicationUserService {

        private final ApplicationUserRepository applicationUserRepository;

        @Autowired
        public ApplicationUserService(ApplicationUserRepository applicationUserRepository) {
            this.applicationUserRepository = applicationUserRepository;
        }

        public ApplicationUser createUser(String username, String type, String country) {
            ApplicationUser user = new ApplicationUser(username, type, country);
            return applicationUserRepository.saveAndFlush(user);
        }

        public List<ApplicationUser> getAllUser() {
            return applicationUserRepository.findAll();
        }

        public ApplicationUser getUser(Long id) {
            ApplicationUser user = null;
            if (id != null) {
                user = applicationUserRepository.findOne(id);
            }
            return user;
        }

        public boolean deleteUser(Long id) {
            if (id != null) {
                try {
                    applicationUserRepository.delete(id);
                    return true;
                } catch (IllegalArgumentException ex) {
                    ex.printStackTrace();
                    return false;
                }
            } else {
                System.out.println("Id is null");
                return false;
            }
        }
    }
And finally, the web controller:
    @RestController
    @RequestMapping(value = "/applicationuser")
    public class ApplicationUserController {

        @Autowired
        private ApplicationUserService applicationUserService;

        @RequestMapping(value = "/", method = RequestMethod.GET)
        @ResponseBody
        public ApplicationUser index() {
            return applicationUserService.createUser("test", "test", "test");
        }
    }
It's possible I'm missing a lot of things (annotations, initializers, code), but I'm here to learn, and any advice would help.
Thanks for your answers.
Spring Data REST
This project will allow you to achieve your goals with significantly less boilerplate code.
Follow the "Accessing JPA Data with REST" guide, which demonstrates how to configure Spring Boot + Spring Data REST with absolutely minimal configuration.
Once you have a basic understanding, you can add more functionality to meet your business requirements.
Detailed information is provided in the Spring Data REST documentation.
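To illustrate how little code this approach needs, a repository like the one in the question becomes a full CRUD REST endpoint with a single annotation (a sketch; the path value is an assumption):

```java
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.rest.core.annotation.RepositoryRestResource;

// Spring Data REST exposes CRUD endpoints for this repository
// under /applicationUsers, with no controller or service class needed.
@RepositoryRestResource(path = "applicationUsers")
public interface ApplicationUserRepository extends JpaRepository<ApplicationUser, Long> {
}
```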

SpringWeb Jmustache and #DateTimeFormat

I have a Spring Boot application using server-side Mustache templates (JMustache).
A simple bean with a @DateTimeFormat annotation:
    import java.util.Date;
    import org.springframework.format.annotation.DateTimeFormat;

    public class GeneralInformation {

        private Date serverTime = new Date();

        @DateTimeFormat(pattern = "dd.MM.yyyy")
        public Date getServerTime() {
            return serverTime;
        }

        public void setServerTime(Date serverTime) {
            this.serverTime = serverTime;
        }
    }
A simple controller adding the bean to the model:
    @Controller
    @RequestMapping(value = "/")
    public class RootController {

        // some Autowiring stuff here...

        @RequestMapping(value = "")
        public String index(Model model) {
            model.addAttribute("generalInformation", new GeneralInformation());
            return "hello";
        }
    }
And my server-side Mustache template stored under templates/hello.html:
<p>Servertime: {{generalInformation.serverTime}}</p>
When using JSPs, the output of the date is formatted according to the pattern in the @DateTimeFormat annotation, but not when using my Mustache template.
I could format the date in the @Controller-annotated method and store it as a String in the bean, but that doesn't seem like a good way.
Does anybody know if it is possible to make JMustache aware of the format annotations?
How else could I achieve formatting when using JMustache together with Spring MVC?
@DateTimeFormat only works with JSP rendering; JMustache just renders the raw Date value, so the pattern is not applied.
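One common workaround is to expose a second, pre-formatted getter on the bean and reference that from the template instead (a sketch; the getter name is illustrative):

```java
import java.text.SimpleDateFormat;
import java.util.Date;

public class GeneralInformation {

    private Date serverTime = new Date();

    public Date getServerTime() {
        return serverTime;
    }

    public void setServerTime(Date serverTime) {
        this.serverTime = serverTime;
    }

    // JMustache can resolve this via {{generalInformation.serverTimeFormatted}}
    public String getServerTimeFormatted() {
        return new SimpleDateFormat("dd.MM.yyyy").format(serverTime);
    }
}
```

The template line then becomes `<p>Servertime: {{generalInformation.serverTimeFormatted}}</p>`.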