Creating (mutable) Spring Security ACLs fails half the time - spring-mvc

I'm using JDBC-backed Spring Security ACLs to manage my user-created objects.
I have a @Service that handles the CRUD of my ACL-protected objects, and so needs to generate the appropriate ACLs and store them. I've marked the entire class as @Transactional, configured in my spring-security.xml as:
<bean id="oauthTXManager"
      class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource"/>
</bean>
<tx:annotation-driven transaction-manager="oauthTXManager" />
That dataSource works (promise!); it's a Postgres database, in case that matters.
So back to the @Service. It looks like this (in part):
@Autowired
@Qualifier("aclService")
private MutableAclService aclService;
...
public Store createNewProfileWithOwner(Store profile, User owner) {
    try {
        Connection con = dataSource.getConnection();
        PreparedStatement query = con.prepareStatement(PROFILE_INSERT);
        ...
        query.executeUpdate();
        Sid sid = new PrincipalSid(owner.getUsername());
        Permission p = BasePermission.ADMINISTRATION;
        ObjectIdentity oi = new ObjectIdentityImpl(profile);
        MutableAcl acl = null;
        try {
            acl = (MutableAcl) aclService.readAclById(oi);
        } catch (NotFoundException e) {
            acl = aclService.createAcl(oi);
        }
        acl.setOwner(sid);
        acl.insertAce(acl.getEntries().size(), p, sid, true);
        aclService.updateAcl(acl);
        profile.setOwner(owner.getUsername());
        ...
    } catch (SQLException e) {
        e.printStackTrace();
    }
    return profile;
}
I have a wee script that tests the API which calls this method. Roughly half the times I run the script, I get an error at acl = aclService.createAcl(oi): having created the ACL, Spring Security tries to read it back but can't find it, much like the problem described in this forum thread. The other half of the time it works just fine. The best I could narrow it down beyond "randomly, about half the time" was this: if I ran the script and it failed, and then ran it again more than four seconds but less than twenty seconds later, it would work.
Weirdly, when I check the DB, the id that Spring Security claims it can't find is definitely there.
I'm assuming that I'm running up against some kind of transaction or caching issue. I read Section 10.5.6 of the Transaction Management chapter, but I'm afraid it didn't really help me pinpoint where the error might be.
Any and all advice welcome.

I "fixed" it by making all the methods synchronized, and, tipped off by Spring Security's @Secured and @PreAuthorize, I forced @PreAuthorize to be evaluated before @Transactional.
So far, it works.
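A note on the workaround above: if the goal is just to make the security advisor run before the transaction advisor, both XML elements expose an order attribute for their AOP advisors (lower values bind first). A sketch, assuming the Spring Security 3 namespace; the numeric values are purely illustrative:

```xml
<!-- Illustrative ordering: the method-security interceptor (100) runs
     before the transaction advisor (200). -->
<security:global-method-security pre-post-annotations="enabled" order="100"/>
<tx:annotation-driven transaction-manager="oauthTXManager" order="200"/>
```

Separately, it may be worth checking whether the raw dataSource.getConnection() call in createNewProfileWithOwner participates in the @Transactional transaction at all: Spring's DataSourceUtils.getConnection(dataSource) returns the transaction-bound connection, whereas calling DataSource.getConnection() directly does not.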

Related

Export IIS in-process session data while debugging

I have a production IIS server with an ASP.NET MVC app. I've hit a tricky bug that I can't capture, and it's linked to session data. How can I export/see/view such a user session? IIS is using the default configuration for session storage: in-process.
EDIT
By the way, I have the appropriate user session ID.
EDIT2
Ok, guys, so even if I can't export that data right now, could you please point me at a session state server or something similar that I can use for storing session data and viewing it later?
I know SQL Server can do this, but it is very heavyweight for such an issue.
Chris is right. Following on from his idea, you could write a routine that outputs the content of your session objects to a file (a kind of custom log).
//Controller Action where you store some objects in session
public ActionResult Index()
{
    var myObj = new { strTest = "test string", dtTestValue = DateTime.Now, listTest = new List<string>() { "list item 1", "list item 2", "list item 3" } };
    Session["test1"] = "Test";
    Session["test2"] = myObj;
    return View();
}

//Controller Action where you output session objects to a file
[HttpPost]
public ActionResult Index(FormCollection form)
{
    //Routine to write each session object serialized as JSON to a file
    var serializer = new JavaScriptSerializer();
    foreach (string key in Session.Keys)
    {
        var obj = Session[key];
        using (var file = new System.IO.StreamWriter(@"C:\Users\Public\CustomAspNetLog.txt", true))
        {
            file.WriteLine(DateTime.Now.ToString() + "\t" + serializer.Serialize(obj));
        }
    }
    return View();
}
If you need to call that routine often, put it in a helper class and call it from your controller actions whenever you want. Then you can inspect the actual session data at every step you find necessary.
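The mechanics of that routine (iterate the keys, serialize each value, append a timestamped line) are language-agnostic. A minimal runnable sketch, shown here in Java per the rest of this page's examples, with toString standing in for real JSON serialization and the log path left to the caller:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.time.LocalDateTime;
import java.util.Map;

// Toy stand-in for the session-dumping routine: appends one timestamped,
// tab-separated line per session key to a log file.
class SessionDump {
    static void dump(Map<String, Object> session, Path logFile) throws IOException {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, Object> e : session.entrySet()) {
            sb.append(LocalDateTime.now())
              .append('\t').append(e.getKey())
              .append('\t').append(e.getValue())   // toString in place of JSON
              .append(System.lineSeparator());
        }
        Files.write(logFile, sb.toString().getBytes(),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }
}
```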
No, you will need to write a routine to export the session data as and when required.
There is a better option for storing sessions than a StateServer: a distributed cache provider.
Alachisoft provides NCache Express, which is totally free; you can use it to store your sessions. Here is how you do it.
Install NCache on each web server.
Define a distributed cache, and make sure you test it to ensure it is working properly.
Modify the web.config file to add the SessionState provider information and the name of the cache you've just created:
<sessionState cookieless="false" regenerateExpiredSessionId="true"
              mode="Custom" customProvider="NCacheSessionProvider" timeout="1">
    <providers>
        <add name="NCacheSessionProvider"
             type="Alachisoft.NCache.Web.SessionState.NSessionStoreProvider"
             cacheName="myreplicatedcache"
             writeExceptionsToEventLog="false"
             AsyncSession="false"/>
    </providers>
</sessionState>
Please note that Version=3.2.1.0 should match the specific NCache Express version you've downloaded. Once you do this, your ASP.NET application is ready to start using distributed sessions.

Spring JdbcTemplate resultset empty first time

Put simply, the problem is that the result set is empty the first time I call the JdbcTemplate and query the database.
When I access the DAO methods a second time, I get the expected results. Here is more information about how the classes are set up:
I have a data access object that extends a parent data access object. The parent DAO simply injects the data source into the constructor of a JdbcTemplate:
public class BaseDao
{
    private JdbcTemplate usrJdbcTemplate;

    public void setUsrDataSource(DataSource usrDataSource)
    {
        this.usrJdbcTemplate = new JdbcTemplate(usrDataSource);
    }

    public JdbcTemplate getUsrJdbcTemplate()
    {
        return this.usrJdbcTemplate;
    }
}
The class that extends this makes use of the JdbcTemplate to query a table:
public class OimUserDao extends BaseDao
{
    public Date getPasswordExpiryDate(String userName)
    {
        String sql = "select USR_PWD_EXPIRE_DATE from USR where UPPER (USR_LOGIN) = ?";
        List<java.sql.Date> dtLst = getUsrJdbcTemplate().query(sql, new Object[] {userName.toUpperCase()}, new RowMapper<java.sql.Date>()
        {
            @Override
            public java.sql.Date mapRow(final ResultSet rs, int rowNum) throws SQLException
            {
                return rs.getDate(1);
            }
        });
        if (dtLst.size() > 0)
        {
            return dtLst.get(0);
        }
        else
        {
            return null;
        }
    }
}
The DAO is autowired into a service using the @Autowired annotation. DAO declarations in the XML:
<bean id="baseDao" class="us.worldpay.portalgateway.dao.BaseDao">
    <property name="usrDataSource" ref="usrDataSource" />
</bean>
<bean id="oimUserDao" class="us.worldpay.portalgateway.dao.OimUserDao" parent="baseDao" />
Web.xml declares the XML files containing the bean definitions for these DAOs (pg-data.xml is the one we are concerned with):
<context-param>
    <param-name>contextConfigLocation</param-name>
    <param-value>
        /WEB-INF/pg-servlet.xml,
        /WEB-INF/pg-data.xml
    </param-value>
</context-param>
To make things worse, this NEVER happens on my local box (WebLogic), which points to the same Oracle database as our QA environment does. When I run this locally, the result set is populated the first time around. When I run it in the QA environment (also WebLogic), the result set is empty the first time. I have spent hours and hours on this and have gotten nowhere.
I appreciate your time reading this post, and I am grateful for any help I can get.
I see no reason for the BaseDao. I'd dispense with it and simply inject the SimpleJdbcTemplate into the DAOs that need it.
You should use a PreparedStatement rather than escaping the username on your own, and I wouldn't use a LIKE clause in the query. Your username should be unique, so you should get exactly one Date back.
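To illustrate the first suggestion, a minimal sketch of the DAO without BaseDao (constructor injection; the names are reused from the question, and the query method is elided):

```java
// Sketch: the DAO owns its template directly instead of inheriting it from a base class.
public class OimUserDao {
    private final SimpleJdbcTemplate jdbcTemplate;

    public OimUserDao(DataSource usrDataSource) {
        this.jdbcTemplate = new SimpleJdbcTemplate(usrDataSource);
    }
    // ... getPasswordExpiryDate and friends use this.jdbcTemplate
}
```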
OK, let's ask some more questions.
You say the ResultSet is empty "the first time". Are you saying that if you merely re-run the method call that it's not empty? "I think so, yes." You need to be certain. It's central to your question.
Are you sure that your local machine and QA environment are pointing at the very same Oracle instance? "That I am very sure of. Yes." Very good!
If you log onto Oracle with a client and run the identical query, does it bring back the results you expect? "Yes it does." Excellent!
Have you stepped through the code with a debugger to see what's happening? I'd recommend it highly.
One more question: Is there an exception thrown that you're not catching? Spring wraps SQLException into an unchecked DataAccessException. Maybe something bad happens the first time that's rectified on the second call. I'm just grasping at straws here.

How to make a Custom Deployer to write data to MS SQL database?

I've added a custom module in the default processor in config/cd_deployer_conf.xml:
<Processor Action="Deploy" Class="com.tridion.deployer.Processor">
    ...
    <Module Type="MyCustomModuleDeploy" Class="com.tridion.new.extensions.MyCustomModule">
    </Module>
</Processor>
The code for the module looks something like this:
public class MyCustomModule extends com.tridion.deployer.modules.PageDeploy {
    static Logger logger = Logger.getLogger("customDeployerFirst");

    public MyCustomModule(Configuration config, Processor processor)
            throws ConfigurationException {
        super(config, processor);
    }

    public void processPage(Page page, File pageFile) throws ProcessingException {
        Date d = new Date();
        logger.info("WORKING");
        logger.info("Page ID: " + page.getId().toString());
        logger.info("Publication date: " + d.toString());
    }
}
In my log file I get the info I wanted, every time a page is published.
What I want to do next is to write the page ID and publication date to my Microsoft SQL database, in a table I previously created. How can I do that? How can I access the database table from MyCustomModule?
Thanks
Not sure of your requirement, but you have already chosen the deployer extension model over storage extensions. With storage extensions, Tridion provides a model for how you can extend storages (such as the JPA framework and base DAO entities that you can extend). If you are going the deployer extension route, then as Quirijn and Nuno mentioned, it is just standard JDBC code like in any other app.
But I would suggest you also look at the storage extension model and see if it fits your requirement. A very good starting point is the article below: http://www.sdltridionworld.com/articles/sdltridion2011/tutorials/extending-content-delivery-storage-sdltridion-2011-1.aspx
Ok, so what I did to solve my problem is exactly what Quirijn suggested: I used the JDBC driver to connect to my database and then executed an SQL update query:
int sqlStatement = st.executeUpdate("insert into Pages values ('" + page.getId().toString()+ "', '" + ... + "')");
This way, each time a page is published, some data is written to my database.
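One caution about that concatenated statement: a value containing a quote will break it, and it is open to SQL injection. A parameterized sketch of the same insert (the column names are assumptions, since the table definition isn't shown; con, page, and d come from the surrounding module code):

```java
// Hypothetical column names; bound parameters replace string concatenation.
PreparedStatement ps = con.prepareStatement(
        "insert into Pages (PageId, PublishedAt) values (?, ?)");
ps.setString(1, page.getId().toString());
ps.setTimestamp(2, new java.sql.Timestamp(d.getTime()));
int rows = ps.executeUpdate();
ps.close();
```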

NHibernate: Get all opened sessions

I have an ASP.NET application using NHibernate. For some reason a few developers forgot to close the sessions in some pages (about 20, I think). I know the best solution is to go through each page and make sure the sessions are closed properly, but I can't make that kind of change because the code is already in production. So I was trying to find a way to get all the open sessions from the session factory and then close them, using the master page or an additional process, but I can't find a way to do that.
So, is there a way to get all the open sessions? Or maybe set a session idle timeout or something? What do you suggest? Thanks in advance.
As far as I know, there is no support for getting a list of open sessions from the session factory. I have my own method of keeping an eye on open sessions, and I use this construction:
Create a class with a weak reference to an ISession. This way you won't interfere with the garbage collector if sessions are being collected:
public class SessionInfo
{
    private readonly WeakReference _session;

    public SessionInfo(ISession session)
    {
        _session = new WeakReference(session);
    }

    public ISession Session
    {
        get { return (ISession)_session.Target; }
    }
}
Create a list for storing your open sessions:
List<SessionInfo> OpenSessions = new List<SessionInfo>();
And in the DAL (data access layer) I have this method:
public ISession GetNewSession()
{
    if (_sessionFactory == null)
        _sessionFactory = createSessionFactory();

    ISession session = _sessionFactory.OpenSession();
    OpenSessions.Add(new SessionInfo(session));
    return session;
}
This way I maintain a list of open sessions I can query when needed. Perhaps this meets your needs?
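The missing piece for the original goal (actually closing whatever is still open) is a sweep over that list that skips entries the garbage collector has already reclaimed. A runnable sketch of the same bookkeeping, shown in Java with a stand-in Session type since the idea is language-agnostic:

```java
import java.lang.ref.WeakReference;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;

// Stand-in for ISession; only the open/close state matters for the bookkeeping.
class Session {
    boolean open = true;
    void close() { open = false; }
}

// Tracks sessions without preventing their garbage collection.
class SessionTracker {
    private final List<WeakReference<Session>> refs = new ArrayList<>();

    Session openSession() {
        Session s = new Session();
        refs.add(new WeakReference<>(s));
        return s;
    }

    // Closes every tracked session that is still reachable and open;
    // returns how many were closed by this sweep.
    int closeAll() {
        int closed = 0;
        Iterator<WeakReference<Session>> it = refs.iterator();
        while (it.hasNext()) {
            Session s = it.next().get();
            if (s == null) {
                it.remove();            // already collected; drop the stale entry
            } else if (s.open) {
                s.close();
                closed++;
            }
        }
        return closed;
    }
}
```

The same sweep could run from a master page or a background task, mirroring the approach the asker had in mind.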

Will transactionscope work over multiple calls to different services?

I'm writing some merge functionality in a C# ASP.NET MVC 2 application, using LINQ to SQL.
I have a block of code that calls two services, MessageService and UserService. These in turn call their respective repositories and make the amendments to the database. Each repository declares its own instance of the DataContext, so I'm thinking this will escalate the following code to DTC. The code is called from the AccountService; is this going to work at this level? And is it bad practice to declare the DataContext at the top of every repository, or should I pass the object around somehow? Thank you in advance.
//Run the merge
try
{
    using (TransactionScope scope = new TransactionScope())
    {
        // Update Messages to be owned by primary user
        if (!_MessageService.UpdateCreatedById(MergeUser.UserID, CoreUser.UserID))
        {
            return false;
        }

        // Update Comments to be owned by primary user
        foreach (var curComment in _MessageService.GetUserComments(MergeUser.UserID))
        {
            curComment.CreatedBy = CoreUser.UserID;
        }
        _MessageService.Save();

        // Update Logins to be owned by primary user
        foreach (var CurLogin in _UserService.GetLogins(MergeUser.UserID))
        {
            CurLogin.UserID = CoreUser.UserID;
        }
        _UserService.Save();

        scope.Complete();
    }
    return true;
}
catch (Exception ex)
{
    _ErrorStack.Add(ex.Message);
    ErrorService.AddError(new ErrorModel("Portal", "WidgetRepository", ErrorHelper.ErrorTypes.Critical, ex));
    return false;
}
Yes, this will work. TransactionScope leverages the Distributed Transaction Coordinator, so it is capable of hosting transactions that span more than one database connection.
Recommended practice for the DataContext lifecycle is to restrict it to a unit of work.
I have two constructors on my repositories: one that takes a data context and one that does not (which then instantiates its own). This means I can create repositories using a shared data context as required.
My service classes then take a repository object in their constructor, so I can instantiate several services whose repositories share a data context, if so required.
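That two-constructor pattern is small enough to sketch concretely. Shown in Java with stand-in types (the C# shape is identical; DataContext here is just a placeholder for the LINQ to SQL context):

```java
// Stand-in for the LINQ to SQL DataContext.
class DataContext { }

// Repository with the two constructors described above.
class Repository {
    final DataContext context;

    // No-arg constructor: the repository owns a private context.
    Repository() {
        this(new DataContext());
    }

    // Shared-context constructor: the repository participates in a
    // caller-managed unit of work.
    Repository(DataContext shared) {
        this.context = shared;
    }
}

// Service that receives its repository, so several services can be
// wired up against repositories sharing one context.
class MessageService {
    final Repository repository;
    MessageService(Repository repository) { this.repository = repository; }
}
```

The caller decides the scope: pass one DataContext to every repository that should commit together, or let each repository default to its own.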
