Spring JdbcTemplate result set empty first time

Put simply, the problem is that the result set is empty the first time I call the JdbcTemplate and query the database.
When I access the DAO methods a second time, I get the expected results. Here is more information about how the classes are set up:
I have a data access object that extends a parent data access object. The parent DAO class simply injects the data source into the constructor of a JdbcTemplate:
public class BaseDao
{
    private JdbcTemplate usrJdbcTemplate;

    public void setUsrDataSource(DataSource usrDataSource)
    {
        this.usrJdbcTemplate = new JdbcTemplate(usrDataSource);
    }

    public JdbcTemplate getUsrJdbcTemplate()
    {
        return this.usrJdbcTemplate;
    }
}
The class that extends this makes use of the JdbcTemplate to query a table:
public class OimUserDao extends BaseDao
{
    public Date getPasswordExpiryDate(String userName)
    {
        String sql = "select USR_PWD_EXPIRE_DATE from USR where UPPER(USR_LOGIN) = ?";
        List<java.sql.Date> dtLst = getUsrJdbcTemplate().query(sql, new Object[] {userName.toUpperCase()}, new RowMapper<java.sql.Date>()
        {
            @Override
            public java.sql.Date mapRow(final ResultSet rs, int rowNum) throws SQLException
            {
                return rs.getDate(1);
            }
        });
        if (dtLst.size() > 0)
        {
            return dtLst.get(0);
        }
        else
        {
            return null;
        }
    }
}
The DAO is autowired into a service using the @Autowired annotation. The DAO declarations in the XML:
<bean id="baseDao" class="us.worldpay.portalgateway.dao.BaseDao">
    <property name="usrDataSource" ref="usrDataSource" />
</bean>
<bean id="oimUserDao" class="us.worldpay.portalgateway.dao.OimUserDao" parent="baseDao" />
web.xml declares the XML files containing the bean definitions for these DAOs (pg-data.xml is the one we are concerned with):
<context-param>
    <param-name>contextConfigLocation</param-name>
    <param-value>
        /WEB-INF/pg-servlet.xml,
        /WEB-INF/pg-data.xml
    </param-value>
</context-param>
To make things worse, this NEVER happens on my local box (WebLogic), which points to the same database (Oracle) as our QA environment does. The result set is populated with data the first time around when I run this on my local box. When I run it in the QA environment (also WebLogic), the result set is empty the first time. I have spent hours and hours on this and have gotten nowhere.
I appreciate your time reading this post. I am grateful for any help I can get.

I see no reason for the BaseDao. I'd dispense with it and simply inject the SimpleJdbcTemplate into the DAOs that need it.
You should use a PreparedStatement placeholder rather than escaping the username on your own, and I wouldn't use a LIKE clause in the query. Your username should be unique; you should get exactly one Date back.
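For what it's worth, a minimal sketch of the direct-injection approach, assuming Java 8 and a Spring version that has the JdbcTemplate.query(String, RowMapper, Object...) overload; the constructor injection is my choice, not the OP's code:

import java.sql.Date;
import java.util.List;
import javax.sql.DataSource;
import org.springframework.jdbc.core.JdbcTemplate;

public class OimUserDao
{
    private final JdbcTemplate jdbcTemplate;

    public OimUserDao(DataSource dataSource)
    {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public Date getPasswordExpiryDate(String userName)
    {
        String sql = "select USR_PWD_EXPIRE_DATE from USR where UPPER(USR_LOGIN) = ?";
        // A lambda replaces the anonymous RowMapper; an empty list maps to null.
        List<Date> dates = jdbcTemplate.query(sql,
                (rs, rowNum) -> rs.getDate(1),
                userName.toUpperCase());
        return dates.isEmpty() ? null : dates.get(0);
    }
}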
OK, let's ask some more questions.
You say the ResultSet is empty "the first time". Are you saying that if you merely re-run the method call that it's not empty? "I think so, yes." You need to be certain. It's central to your question.
Are you sure that your local machine and QA environment are pointing at the very same Oracle instance? "That I am very sure of. Yes." Very good!
If you log onto Oracle with a client and run the identical query, does it bring back the results you expect? "Yes it does." Excellent!
Have you stepped through the code with a debugger to see what's happening? I'd recommend it highly.
One more question: Is there an exception thrown that you're not catching? Spring wraps SQLException into an unchecked DataAccessException. Maybe something bad happens the first time that's rectified on the second call. I'm just grasping at straws here.
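To make that last point concrete, here is a purely diagnostic sketch; the wrapper class and logging are hypothetical, not the OP's code:

import java.sql.Date;
import org.springframework.dao.DataAccessException;

public class ExpiryLookup
{
    public Date lookup(OimUserDao dao, String userName)
    {
        try
        {
            return dao.getPasswordExpiryDate(userName);
        }
        catch (DataAccessException e)
        {
            // Spring translates SQLException into this unchecked hierarchy,
            // so nothing forces the caller to catch it; logging here would
            // reveal whether the first call fails rather than returns empty.
            System.err.println("Query attempt failed: " + e);
            throw e;
        }
    }
}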

Related

cleanup sqlite db file on close

Can we clean up {filename}.db as soon as the prototype bean goes out of scope in a Spring Boot project with the jOOQ starter?
The destroy method needs to get a handle on the filename.
I tried putting ;DB_CLOSE_DELAY=-1 at the end of the URL, but it does not seem to work with SQLite files. I expected some value of DB_CLOSE_DELAY that deletes the file at the end, or keeps everything in memory.
@Bean
public Function<String, DSLContext> dslFactory() {
    return this::dsl;
}

@Bean
@Scope("prototype")
@ConfigurationProperties("datasource")
public DefaultDSLContext dsl(String filename) {
    DataSource dataSource = DataSourceBuilder.create()
            .url("jdbc:sqlite:" + filename + ".db")
            .build();
    DefaultConfiguration jooqConfiguration = new DefaultConfiguration();
    jooqConfiguration.set(new DataSourceConnectionProvider(new TransactionAwareDataSourceProxy(dataSource)));
    jooqConfiguration.set(new DefaultExecuteListenerProvider(new ExceptionTranslator()));
    DefaultDSLContext context = new DefaultDSLContext(jooqConfiguration);
    return context;
}
Usage:
@Autowired
private Function<String, DSLContext> dslFactory;

DSLContext dsl = dslFactory.apply("xxx");
I tried overriding DefaultExecuteListener.end in the prototype bean declaration, but it gets called on every dsl execute(). Something like the following would have been ideal: using Lombok's @Cleanup so that when the final DSLContext goes out of scope, i.e. at the end of the method invocation that called getBean/apply as above, {filename}.db is deleted.
@Cleanup DefaultDSLContext context = new DefaultDSLContext(jooqConfiguration);
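Not something I've tested, but one sketch of that idea is a DSLContext subclass that remembers its backing file and deletes it in close(); the class name and wiring are invented:

import java.io.File;
import org.jooq.Configuration;
import org.jooq.impl.DefaultDSLContext;

// Hypothetical subclass: deletes the SQLite file when the context is closed,
// e.g. via try-with-resources at the end of the consuming method.
public class FileDeletingDSLContext extends DefaultDSLContext {

    private final File dbFile;

    public FileDeletingDSLContext(Configuration configuration, String filename) {
        super(configuration);
        this.dbFile = new File(filename + ".db");
    }

    @Override
    public void close() {
        super.close();
        if (!dbFile.delete()) {
            dbFile.deleteOnExit();   // best-effort fallback at JVM exit
        }
    }
}

Returning this from the dsl(filename) factory and closing it with try-with-resources in the consuming method would delete the file deterministically; note that Spring does not invoke destroy callbacks on prototype beans, so an explicit close is needed in any case.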
That's a rather different question from the one you've asked, and it already has an answer here: stackoverflow.com/q/8831514/521799 – Lukas Eder
According to the link provided, we can use ::memory: at the end of the URL to create an in-memory SQLite database, which requires none of the cleanup otherwise necessary.
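As a drop-in change inside the dsl() method above (assuming the Xerial sqlite-jdbc driver, which understands this URL form):

// In-memory SQLite: nothing is written to disk, so no file cleanup is needed.
DataSource dataSource = DataSourceBuilder.create()
        .url("jdbc:sqlite::memory:")
        .build();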

Spring Boot + MyBatis: about native DAO development

I'm trying a new development method.
In MyBatis 3, I usually write mapper.java and mapper.xml.
I know that each SQL statement is looked up by its sqlId (namespace + id).
I want to execute the SQL statement like this:
SqlSession sqlSession = sessionFactory.openSession();
return sqlSession.selectList(sqlId, param);
but I get an error:
Cause: java.lang.IllegalArgumentException: Mapped Statements collection does not contain value for mapper.JinBoot.test
at org.apache.ibatis.exceptions.ExceptionFactory.wrapException(ExceptionFactory.java:30)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:150)
at org.apache.ibatis.session.defaults.DefaultSqlSession.selectList(DefaultSqlSession.java:141)
at cn.tianyustudio.jinboot.dao.BaseDao.select(BaseDao.java:20)
at cn.tianyustudio.jinboot.service.BaseService.select(BaseService.java:10)
at cn.tianyustudio.jinboot.controller.BaseController.test(BaseController.java:21)
here is my BaseDao.java
public class BaseDao {
private static SqlSessionFactoryBean factoryBean = new SqlSessionFactoryBean();
public static List<Map> select(String sqlId, Map param) {
try {
factoryBean.setDataSource(new DruidDataSource());
SqlSessionFactory sessionFactory = factoryBean.getObject();
SqlSession sqlSession = sessionFactory.openSession();
return sqlSession.selectList(sqlId, param);
} catch (Exception e) {
e.printStackTrace();
}
return null;
}
}
here is UserMapper.xml
<mapper namespace="mapper.JinBoot">
<select id="test" parameterType="hashMap" resultType="hashMap">
select * from user
</select>
</mapper>
The application.properties:
mybatis.mapperLocations=classpath:mapper/*.xml
I start the project and send an HTTP request; after the controller and service, the param 'sqlId' in BaseDao is 'mapper.JinBoot.test' (see the error info).
In the method BaseDao.select, both the parameter and the result type are Map.
So I don't want to create UserMapper.java; I want to try it this way.
How can I resolve this? What are the drawbacks of this approach?
This does not work because Spring Boot creates its own SqlSessionFactory, and the option in application.properties that specifies where mappers should be looked for is only set on that SqlSessionFactory. You are creating an unrelated session factory in your DAO, and it does not know where to load the mapper definitions from.
If you want to make it work, your DAO needs to be Spring-managed so that you can inject the MyBatis session factory into it and use it in select. This also requires converting select into a non-static method, as sketched below.
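A sketch of that, where the @Repository stereotype, constructor injection, and try-with-resources are my choices rather than the answerer's:

import java.util.List;
import java.util.Map;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;
import org.springframework.stereotype.Repository;

// Spring-managed DAO: the injected SqlSessionFactory is the one Boot built,
// so it knows about mybatis.mapperLocations from application.properties.
@Repository
public class BaseDao {

    private final SqlSessionFactory sessionFactory;

    public BaseDao(SqlSessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public List<Map<String, Object>> select(String sqlId, Map<String, Object> param) {
        try (SqlSession sqlSession = sessionFactory.openSession()) {
            return sqlSession.selectList(sqlId, param);
        }
    }
}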
As I understand it, you want to have only one method in your base DAO class and use it from individual specific DAO classes. I would say this makes little sense. If the method returns Map, there has to be some place that maps this generic type to application-specific types, probably in the child DAOs. So you still need to give each child DAO an API whose signature takes some input parameters and returns domain objects, and that's exactly what you wanted to avoid by not creating MyBatis mapper classes.
The thing is, you can treat your MyBatis mappers as DAOs. That is, your mappers would be your DAOs, and you don't need another layer. As I understand it, you currently have two separate layers, DAOs and mappers, and you want to remove boilerplate code. I think it is better to remove the DAO classes. They are real boilerplate, and a MyBatis mapper can serve as a DAO perfectly well: you inject it directly into your service, and the service depends only on the mapper class. The logic of the mapping lives in the mapper XML file. See also the answer to this question: Can Spring DAO be merged into Service layer?
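For illustration, a hypothetical mapper-as-DAO; the interface is invented, and the namespace in UserMapper.xml must match the interface's fully qualified name:

package mapper;

import java.util.List;
import java.util.Map;
import org.apache.ibatis.annotations.Mapper;

// The mapper is the DAO. With the mybatis-spring-boot starter, Boot generates
// the implementation from the XML whose namespace matches this interface
// (here: mapper.JinBoot).
@Mapper
public interface JinBoot {
    List<Map<String, Object>> test();   // matches <select id="test"> in the XML
}

A service would then just autowire JinBoot and call test() directly, with no intermediate DAO layer.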

how to dynamically register Feed Inbound Adapter in Spring Integration?

I'm trying to implement an RSS/Atom feed aggregator in spring-integration and I am primarily using the Java DSL to write my IntegrationFlow. A requirement of this aggregator is that feeds can be added / removed during runtime. That is to say, the feeds are not known at design time.
I found it simple to use the basic Feed.inboundAdapter() with a test url, extract the links out of the feed with a transformer, and then pass them on to an outbound-file-adapter to save the links to a file. However, I have gotten very stuck trying to read the (thousands of) feed urls from an inbound-file-adapter, run the file through a FileSplitter, and then pass each resulting Message<String> containing a feed url on to register a new Feed.inboundAdapter(). Is this not possible with the Java DSL?
Ideally I would love it if I could do the following:
@Bean
public IntegrationFlow getFeedsFromFile() throws MalformedURLException {
    return IntegrationFlows.from(inboundFileChannel(), e -> e.poller(Pollers.fixedDelay(10000)))
            .handle(new FileSplitter())
            // register new Feed.inboundAdapter(payload.toString()) for each
            // Message<String> containing a feed url coming from the FileSplitter
            .transform(extractLinkFromFeedEntry())
            .handle(appendLinkToFile())
            .get();
}
Though after reading through the spring integration java DSL code multiple times (and learning a tonne of stuff along the way) I just can't see that it's possible to do it this way. So... A) is it? B) should it be? C) Suggestions?
It almost feels like I should be able to take the output of .handle(new FileSplitter()) and pass that into .handleWithAdapter(Feed.inboundAdapter(/*stuff here*/)) but the DSL only references outbound-adapters there. Inbound adapters are really just a subclass of AbstractMessageSource and it seems the only place you can specify one of those is as an argument to the IntegrationFlows.from(/*stuff here*/) method.
I would have thought it would be possible to take the input from a file, split it line by line, use that output to register inbound feed adapters, poll those feeds, extract the new links from feeds as they appear and append them to a file. It appears as though it's not.
Is there some clever subclassing I can do to make this work?
Failing that, and I suspect this is going to be the answer, I found the Spring Integration Dynamic FTP Channel Resolver example and this answer on how to adapt it to dynamically register things for the inbound case...
So is this the way to go? Any help/guidance appreciated. After poring over the DSL code and reading documentation for days, I think I'll have a go at implementing the dynamic FTP example and adapting it to work with FeedEntryMessageSource... in which case my question is: that dynamic FTP example works with XML configuration, but is it possible to do it with either Java config or the Java DSL?
Update
I've implemented the solution as follows:
@SpringBootApplication
class MonsterFeedApplication {

    public static void main(String[] args) throws IOException {
        ConfigurableApplicationContext parent = SpringApplication.run(MonsterFeedApplication.class, args);
        parent.setId("parent");
        String[] feedUrls = {
                "https://1nichi.wordpress.com/feed/",
                "http://jcmuofficialblog.com/feed/"};
        List<ConfigurableApplicationContext> children = new ArrayList<>();
        int n = 0;
        for (String feedUrl : feedUrls) {
            AnnotationConfigApplicationContext child = new AnnotationConfigApplicationContext();
            child.setId("child" + ++n);
            children.add(child);
            child.setParent(parent);
            child.register(DynamicFeedAdapter.class);
            StandardEnvironment env = new StandardEnvironment();
            Properties props = new Properties();
            props.setProperty("feed.url", feedUrl);
            PropertiesPropertySource pps = new PropertiesPropertySource("feed", props);
            env.getPropertySources().addLast(pps);
            child.setEnvironment(env);
            child.refresh();
        }
        System.out.println("Press any key to exit...");
        System.in.read();
        for (ConfigurableApplicationContext child : children) {
            child.close();
        }
        parent.close();
    }

    @Bean
    public IntegrationFlow aggregateFeeds() {
        return IntegrationFlows.from("feedChannel")
                .transform(extractLinkFromFeed())
                .handle(System.out::println)
                .get();
    }

    @Bean
    public MessageChannel feedChannel() {
        return new DirectChannel();
    }

    @Bean
    public AbstractPayloadTransformer<SyndEntry, String> extractLinkFromFeed() {
        return new AbstractPayloadTransformer<SyndEntry, String>() {
            @Override
            protected String transformPayload(SyndEntry payload) throws Exception {
                return payload.getLink();
            }
        };
    }
}
DynamicFeedAdapter.java
@Configuration
@EnableIntegration
public class DynamicFeedAdapter {

    @Value("${feed.url}")
    public String feedUrl;

    @Bean
    public static PropertySourcesPlaceholderConfigurer pspc() {
        return new PropertySourcesPlaceholderConfigurer();
    }

    @Bean
    public IntegrationFlow feedAdapter() throws MalformedURLException {
        URL url = new URL(feedUrl);
        return IntegrationFlows
                .from(s -> s.feed(url, "feedTest"),
                        e -> e.poller(p -> p.fixedDelay(10000)))
                .channel("feedChannel")
                .get();
    }
}
And this works IF and only IF I have one of the urls defined in application.properties as feed.url=[insert url here]. Otherwise it fails, telling me 'unable to resolve property {feed.url}'. I suspect what is happening is that the @Beans defined in DynamicFeedAdapter.java all get eagerly initialized as singletons, so aside from the beans manually created in the for loop in the main method (which work fine, because they have the feed.url property injected), we have a stray singleton that has been eagerly initialized, and if there is no feed.url defined in application.properties it can't resolve the property and everything goes bang. Now from what I know of Spring, it should be possible to @Lazy-initialize the beans in DynamicFeedAdapter.java so we don't wind up with this one unwanted stray singleton. The problem is that if I just mark feedAdapter() @Lazy, the beans never get initialized. How do I initialize them myself?
Update - problem solved
Without having tested it, I think the problem is that Boot is finding the DynamicFeedAdapter during its component scan. A simple solution is to move it to a sibling package. If MonsterFeedApplication is in com.acme.foo, then put the adapter config class in com.acme.bar. That way, Boot won't consider it "part" of the application.
This was indeed the problem. After implementing Gary's suggestion, everything works perfectly.
See the answer to this question and its follow up for a similar question about inbound mail adapters.
In essence, each feed adapter is created in a child context that is parameterized.
In that case the child contexts are created in a main() method but there's no reason it couldn't be done in a service invoked by .handle().
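Purely as an untested sketch of that last point, the child-context bootstrapping from the main() above could move into a plain service invoked via .handle(feedRegistrar, "register") after the FileSplitter; FeedRegistrar is an invented name:

import java.util.Properties;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.AnnotationConfigApplicationContext;
import org.springframework.core.env.PropertiesPropertySource;
import org.springframework.core.env.StandardEnvironment;

// Hypothetical service: each message payload is a feed url read from the
// file, and we spin up a parameterized child context for it, exactly as
// the main() loop above does.
public class FeedRegistrar {

    private final ApplicationContext parent;

    public FeedRegistrar(ApplicationContext parent) {
        this.parent = parent;
    }

    public void register(String feedUrl) {
        AnnotationConfigApplicationContext child = new AnnotationConfigApplicationContext();
        child.setParent(parent);
        child.register(DynamicFeedAdapter.class);
        StandardEnvironment env = new StandardEnvironment();
        Properties props = new Properties();
        props.setProperty("feed.url", feedUrl);
        env.getPropertySources().addLast(new PropertiesPropertySource("feed", props));
        child.setEnvironment(env);
        child.refresh();   // the child's feed adapter starts polling here
    }
}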

Scheduled database maintenance with Java EE 6 (connection lifetime)

I'm new to Java EE 6 so I apologize if the answer to this question is obvious. I have a task that must run hourly to rebuild a Solr index from a database. I also want the rebuild to occur when the app is deployed. My gut instinct is that this should work:
@Singleton
@Startup
public class Rebuilder {

    @Inject private ProposalDao proposalDao;
    @Inject private SolrServer solrServer;

    @Schedule(hour="*", minute="0", second="0")
    public void rebuildIndex() {
        // do the rebuild here
    }
}
Since I'm using MyBatis, I have written this producer:
public class ProposalSessionProvider {

    private static final String CONFIGURATION_FILE = "...";
    private static SqlSessionFactory sessFactory;   // declaration added; assigned in the static block below
    private SqlSession session;                     // declaration added; assigned in openSession()

    static {
        try {
            sessFactory = new SqlSessionFactoryBuilder().build(
                    Resources.getResourceAsReader(CONFIGURATION_FILE));
        }
        catch (IOException ex) {
            throw new RuntimeException("Error configuring MyBatis: " + ex.getMessage(), ex);
        }
    }

    @Produces
    public ProposalsDao openSession() {
        log.info("Connecting to the database");
        session = sessFactory.openSession();
        return session.getMapper(ProposalsDao.class);
    }
}
So I have three concerns:
What's the appropriate way to trigger a rebuild at deployment time? A @PostConstruct method?
Who is responsible for closing the database connection, and how should that happen? I'm using MyBatis which is, I believe, pretty ignorant of the Java EE lifecycle. It seems like if I use @Singleton the connections will never be released, but is it even meaningful to put @Startup on a @Stateless bean?
Should the Rebuilder be a singleton or not? It seems that if it is not, I couldn't use @PostConstruct to handle the initial rebuild, or I'd get double rebuilds every hour.
I'm not really sure how to proceed here. Thanks for your time.
I don't know MyBatis, but I can tell you that a @Schedule job is transactional. Anyway, I'm not sure that a JTA-managed transaction will apply here, given the way you retrieve the session. Isn't there a way to retrieve a persistence context in MyBatis? For the trigger part, IMHO @Startup will do the job properly, and it will need a singleton bean. I'm not able to tell you which of the two methods you propose is the best one.
For the scheduling part, you are correct; I'd put the index-building logic in a separate class, and have both a (singleton?) @Startup bean and a @Schedule-annotated method in a separate class call it.
JMS could be used by said beans to trigger the index rebuilding, if you don't want a dependency between the index-building code and the triggering code in those classes.
I don't know MyBatis well enough, but if your connection is managed by a data source @Resource, then I believe it could indeed benefit from CMT.
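Pulling those suggestions together, a sketch under EJB 3.1 assumptions; IndexBuilder and all names here are invented, not from the question:

import javax.annotation.PostConstruct;
import javax.ejb.Schedule;
import javax.ejb.Singleton;
import javax.ejb.Startup;
import javax.inject.Inject;

// Hypothetical split: the rebuild logic lives in its own bean, and this
// singleton merely triggers it at deployment and on the hour.
@Singleton
@Startup
public class RebuildScheduler {

    @Inject private IndexBuilder indexBuilder;   // plain CDI bean holding the Solr/DAO logic

    @PostConstruct
    public void onDeploy() {
        indexBuilder.rebuild();   // initial rebuild at deployment time
    }

    @Schedule(hour = "*", minute = "0", second = "0", persistent = false)
    public void onTheHour() {
        indexBuilder.rebuild();
    }
}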

Creating (mutable) Spring Security ACLs fails half the time

I'm using JDBC-backed Spring Security ACLs to manage my user-created objects.
I have a @Service that deals with the CRUD of my ACL-protected objects, and so needs to generate appropriate ACLs and store them. I've marked the entire class as @Transactional, configured in my spring-security.xml as
<bean id="oauthTXManager"
      class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource"/>
</bean>
<tx:annotation-driven transaction-manager="oauthTXManager" />
That dataSource is working (promise!); it's a Postgres DB, if that could be important.
So back to the @Service. It looks like this (in part):
@Autowired
@Qualifier("aclService")
private MutableAclService aclService;
...
public Store createNewProfileWithOwner(Store profile, User owner) {
    try {
        Connection con = dataSource.getConnection();
        PreparedStatement query = con.prepareStatement(PROFILE_INSERT);
        ...
        query.executeUpdate();
        Sid sid = new PrincipalSid(owner.getUsername());
        Permission p = BasePermission.ADMINISTRATION;
        ObjectIdentity oi = new ObjectIdentityImpl(profile);
        MutableAcl acl = null;
        try {
            acl = (MutableAcl) aclService.readAclById(oi);
        } catch (NotFoundException e) {
            acl = aclService.createAcl(oi);
        }
        acl.setOwner(sid);
        acl.insertAce(acl.getEntries().size(), p, sid, true);
        aclService.updateAcl(acl);
        profile.setOwner(owner.getUsername());
        ...
    } catch (SQLException e) {
        e.printStackTrace();
    }
    return profile;
}
I have a wee script which tests the API that calls this method. Roughly half of the times I run the script, I get an error at acl = aclService.createAcl(oi): having created the ACL, Spring Security tries to read it back but can't find it. Much like the problem described in this forum. The other 50% of the time it works just fine. The best I could narrow it down, beyond "randomly, about half the time", was that if I ran the script and it didn't work, then ran it again more than four but less than twenty seconds later, it would work.
Weirdly, when I check the DB, the id that Spring Security claims it can't find is definitely there.
I'm assuming I'm running up against some kind of transaction or caching issue. I read Section 10.5.6 of Transaction Management, but I'm afraid it didn't really help me pinpoint where the error might be.
Any and all advice welcome.
I "fixed" it by making all the methods synchronized, and, tipped off by Spring Security's @Secured and @PreAuthorize, I forced @PreAuthorize to be evaluated before @Transactional.
So far, it works.
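For anyone hitting the same thing: if the cause really is advice ordering, the order can be pinned in XML via the order attributes on both the security and transaction interceptors. The values below are illustrative (lower order runs first), not taken from the question's config:

<!-- Evaluate @PreAuthorize before the transaction interceptor -->
<security:global-method-security pre-post-annotations="enabled" order="0"/>

<tx:annotation-driven transaction-manager="oauthTXManager" order="100"/>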
