How to invalidate ASP.NET cache if data changes in PostgreSql database - asp.net

In an ASP.NET/Mono MVC2 application, the standard ASP.NET Web cache is used to speed up database access:
string GetName() {
    // todo: detect if data has changed and invalidate the cache
    var name = (string)HttpContext.Current.Cache["Name"];
    if (name != null)
        return name;
    name = db.Query("SELECT name from mydata");
    HttpContext.Current.Cache.Insert("Name", name);
    return name;
}
mydata can be changed by another application.
In this case this method returns stale data.
How can I detect that the data has changed and return fresh data from the PostgreSQL database in that case?
It is OK to clear the whole Web cache if mydata has changed.

The best way to do this is likely with LISTEN and NOTIFY.
Have your app maintain a background worker with a persistent connection to the DB. On that connection, issue a LISTEN name_changed, then wait for notifications. If Npgsql supports it, it might offer a callback; otherwise you'll have to poll.
Add a trigger to the name table that issues a NOTIFY name_changed.
When your background worker gets the notification it can flush the cache.
You can even use the NOTIFY payload to selectively invalidate only the entries that changed.
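A minimal sketch of that setup with Npgsql follows. It is an assumption-laden illustration, not code from the question: the channel name name_changed, the trigger, the connection string, and the use of NpgsqlConnection.Notification/Wait() (available in recent Npgsql versions) would all need to be adapted to your schema and library version.
using Npgsql;
using System.Threading;
using System.Web;

// PostgreSQL side (run once) - a statement-level trigger that signals listeners:
//   CREATE FUNCTION notify_name_changed() RETURNS trigger AS $$
//   BEGIN
//       PERFORM pg_notify('name_changed', '');
//       RETURN NULL;
//   END;
//   $$ LANGUAGE plpgsql;
//   CREATE TRIGGER mydata_changed AFTER INSERT OR UPDATE OR DELETE ON mydata
//       FOR EACH STATEMENT EXECUTE PROCEDURE notify_name_changed();

public static class CacheInvalidator
{
    // Call once (e.g. from Application_Start) with your PostgreSQL connection string.
    public static void Start(string connectionString)
    {
        var listener = new Thread(() =>
        {
            using (var conn = new NpgsqlConnection(connectionString))
            {
                conn.Open();
                using (var cmd = new NpgsqlCommand("LISTEN name_changed", conn))
                    cmd.ExecuteNonQuery();

                conn.Notification += (sender, e) =>
                {
                    // The question allows clearing the whole cache;
                    // removing the one cached entry is enough here.
                    HttpRuntime.Cache.Remove("Name");
                };

                while (true)
                    conn.Wait(); // blocks until a NOTIFY arrives on this connection
            }
        });
        listener.IsBackground = true;
        listener.Start();
    }
}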

Related

How can I open database connection at runtime?

I am working on an ASP.NET MVC project. I am making my database transactions with:
using (ISession session = FluentNHibernateHelper.OpenSession())
{
    var user = session.Query<User>()
        .FirstOrDefault(x => x.UserEmail == email && x.UserPassword == password);
}
Instead of using this kind of code block, which opens and closes the connection every time, I want to open the connection at runtime and use that session variable everywhere. Maybe some code in Application_Start() in Global.asax.cs?
I am open to your valuable ideas. Thank you for your help!
It's poor practice to leave a connection open or to maintain ORM state across multiple transactions, as state issues can crop up rather quickly when you make multiple requests against the same object and connection.
However, if you must, you could inject it as a Singleton service which would live longer than a single request. This is problematic for scaling and not recommended.
services.AddSingleton<ISession>(provider =>
{
    return FluentNHibernateHelper.OpenSession();
});
More information: What is the difference between services.AddTransient, service.AddScoped and service.AddSingleton methods in ASP.NET Core?
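For completeness, here is a hypothetical consumer of that registration (the UserController, its action, and the query are illustrations, not from the question). If the session only needs to live for one request, registering it with services.AddScoped instead is usually the safer choice.
// Hypothetical controller; the container injects the registered ISession.
public class UserController : Controller
{
    private readonly ISession _session;

    public UserController(ISession session)
    {
        _session = session;
    }

    public IActionResult Details(string email)
    {
        // assumes using NHibernate.Linq; for Query<T>()
        var user = _session.Query<User>().FirstOrDefault(x => x.UserEmail == email);
        return View(user);
    }
}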

Meteor signaling without db write

I've been looking for a good way to do this, but haven't found anything that doesn't seem hacky. I want to signal the client without going through the database and a subscription. For example, in a game I want to send a message to the client to display "Player 1 almost scores!". I don't care about this information in the long run, so I don't want to push it to the DB. I guess I could just set up another socket.io connection, but I'd rather not have to manage a second connection if there is a good way to do it within Meteor. Thanks! (BTW, I have looked at Meteor Streams, but it appears to have gone inactive.)
You know that Meteor provides real-time communication from the server to clients through the Publish and Subscribe mechanism, which is typically used to send your MongoDB data and later modifications to it.
You would like a similar push system but without having to record some data into your MongoDB.
It is totally possible to re-use the Meteor Pub/Sub system without the database part: while with Meteor.publish you typically return a Collection Cursor (hence data from your DB), you can also use its low-level API to send arbitrary real-time information:
Alternatively, a publish function can directly control its published record set by calling the functions added (to add a new document to the published record set), changed (to change or clear some fields on a document already in the published record set), and removed (to remove documents from the published record set). […]
Simply do not return anything, use the above-mentioned methods, and do not forget to call this.ready() at the end of your publish function.
See also the Guide about Custom publications
// SERVER
const customCollectionName = 'collection-name';
let sender; // <== we will keep a reference to the publisher

Meteor.publish('custom-publication', function() {
  sender = this;
  this.ready();
  this.onStop(() => {
    // Called when a Client stops its Subscription
  });
});

// Later on…
// ==> Send a "new document" as a new signal message
sender.added(customCollectionName, 'someId', {
  // "new document"
  field: 'values2'
});

// CLIENT
const signalsCollectionName = 'collection-name'; // Must match what is used in Server
const Signals = new Mongo.Collection(signalsCollectionName);

Meteor.subscribe('custom-publication'); // As usual, must match what is used in Server

// Then use the Collection low-level API
// to listen to changes and act accordingly
// https://docs.meteor.com/api/collections.html#Mongo-Cursor-observe
const allSignalsCursor = Signals.find();
allSignalsCursor.observe({
  added: (newDocument) => {
    // Do your stuff with the received document.
  }
});
Then how and when you use sender.added() is totally up to you.
Note: keep in mind that it will send data individually to a Client (each Client has their own Server session)
If you want to broadcast messages to several Clients simultaneously, then the easiest way is to use your MongoDB as the glue between your Server sessions. If you do not care about actual persistence, then simply re-use the same document over and over and listen to changes instead of additions in your Client Collection Cursor observer.
It's completely fine to use the database for such a task.
Maybe create a collection of "Streams" where you store the intended receiver and the message; the client subscribes to its stream and watches for any changes on it.
You can then delete the stream from the database after the client is done with it.
This is a lot easier than reinventing the wheel and writing everything from scratch.

Webmatrix.Data.Database Connection String Cleared After Form Submit

I'm developing an ASP.NET (Razor v2) Web Site, and using the WebMatrix.Data library to connect to a remote DB. I have the Database wrapped in a singleton, because it seemed like a better idea than constantly opening and closing DB connections, implemented like so:
public class DB
{
    private static DB sInstance = null;
    private Database mDatabase = null;

    public static DB Instance
    {
        get
        {
            if (sInstance == null)
            {
                sInstance = new DB();
            }
            return sInstance;
        }
    }

    private DB()
    {
        mDatabase = Database.Open("<Connection String name from web.config>");
        return;
    }

    <Query Functions Go Here>
}
("Database" here refers to the WebMatrix.Data.Database class)
The first time I load my page with the form on it and submit, a watch of mDatabase's Database.Connection property shows the following: (Sorry, not enough rep to post images yet.)
http://i.stack.imgur.com/jJ1RK.png
The form submits, the page reloads, the submitted data shows up, everything is a-ok. Then I enter new data and submit the form again, and here's the watch:
http://i.stack.imgur.com/Zorv0.png
The Connection has been closed and its Connection String blanked, despite my not calling Database.Close() anywhere in my code. I have absolutely no idea what is causing this. Has anyone seen it before?
I'm currently working around the problem by calling Database.Open() before and Database.Close() immediately after every query, which seems inefficient.
The Web Pages framework will ensure that connections opened via the Database helper class are closed and disposed when the current page has finished executing. This is by design. It is also why you rarely see connections explicitly closed in any Web Pages tutorial where the Database helper is used.
It is very rarely a good idea to keep permanently open connections in ASP.NET applications; it can cause memory leaks. When Close is called, the connection is not actually terminated by default. It is returned to a pool of connections that ADO.NET connection pooling keeps alive. That way, the effort required to instantiate new connections is minimised while the connections are still managed properly. So all you need to do is call Database.Open in each page. It's the recommended approach.
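So a per-page pattern like the sketch below is all that is needed (the connection string name and query are made up for illustration); connection pooling keeps the repeated Open calls cheap.
// Open per page and let the Web Pages framework dispose it when the page finishes.
var db = Database.Open("MyConnectionString"); // name of a connection string in web.config (assumed)
var name = (string)db.QueryValue("SELECT name FROM mydata");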

How to clear SQL session state for all users in ASP.NET

I use SQLServer SessionState mode to store session in my ASP.NET application. It stores certain objects that are serialized/deserialized every time they are used.
If I make any changes in code to the structure of those objects and I put a new version live, any logged-in user will get an error, as their session objects (the old-version ones) do not match the structure that the new version expects while deserializing.
Is there a way to clear all sessions at once in DB so that all active sessions expire and users are forced to log in again (and therefore all session objects are created from scratch)?
Or... Is there any other way to solve this situation?
You may try using a stored procedure in SQL Server to clear all the sessions:
CREATE PROCEDURE [dbo].[DeleteSessions]
AS
DELETE [ASPState].dbo.ASPStateTempSessions
RETURN 0
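If you would rather run that from application code, for example as a one-off step right after deploying the new version, a minimal sketch follows (the connection string name is an assumption; it requires System.Data, System.Data.SqlClient and System.Configuration):
using (var conn = new SqlConnection(ConfigurationManager.ConnectionStrings["ASPState"].ConnectionString))
using (var cmd = new SqlCommand("dbo.DeleteSessions", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    conn.Open();
    cmd.ExecuteNonQuery(); // removes every row from ASPStateTempSessions
}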
You can call Session.Abandon, or Clear for every user when they hit the invalid Session object.
You can also loop through the per-user Session collection, and clear the keys that can contain "old" objects. Maybe you have a login ticket and such that you don't want to clear.
// Copy the keys first: removing entries while enumerating Session.Keys directly
// would invalidate the enumerator (requires System.Linq for Cast/ToList).
foreach (string key in Session.Keys.Cast<string>().ToList())
{
    if (!key.Equals("login"))
    {
        Session.Remove(key);
    }
}

Is profile data for current user retrieved just once

a) When the current user accesses the Profile object for the first time, does ASP.NET
retrieve the complete profile for that user, or are profile properties retrieved one at a time as they are called?
b) In either case, is profile data for the current user retrieved from the DB each time it is accessed, or is it retrieved just once and then kept for the duration of the current request?
Thanks
You don't specify if this is a Website Project or a Web Application Project; when it comes to profiles there is a big difference as they are not implemented out of the box for the Web Application Project template:
http://codersbarn.com/post/2008/07/10/ASPNET-PayPal-Subscriptions-IPN.aspx
http://leedumond.com/blog/asp-net-profiles-in-web-application-projects/
Have you actually implemented it yet or are you just in the planning stage? If the latter, then the above links should provide some valuable info. As regards the caching issue, I would go with Dave's advice.
If you want to save a database call, you can use a utility method to cache the profile or you can implement your own custom MembershipProvider which handles the caching.
The utility method is probably the simplest solution in this case unless you have other functionality you want to implement which would be served by implementing a custom MembershipProvider.
You can refer to this link for more details:
How can I access UserId in ASP.NET Membership without using Membership.GetUser()?
Here's an example of a utility method to do caching. You can pass a slidingMinutesToExpire to make it expire from the cache after some duration of time to avoid consuming excess server memory if you have many users.
public static void AddToCache(string key, Object value, int slidingMinutesToExpire)
{
    if (slidingMinutesToExpire == 0)
    {
        HttpRuntime.Cache.Insert(key, value, null, System.Web.Caching.Cache.NoAbsoluteExpiration, System.Web.Caching.Cache.NoSlidingExpiration, System.Web.Caching.CacheItemPriority.NotRemovable, null);
    }
    else
    {
        HttpRuntime.Cache.Insert(key, value, null, System.Web.Caching.Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(slidingMinutesToExpire), System.Web.Caching.CacheItemPriority.NotRemovable, null);
    }
}
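A hypothetical usage example, not from the original answer (the cache key scheme and the idea of caching HttpContext.Current.Profile are assumptions): cache the current user's profile with a 20-minute sliding expiration so repeated requests skip the provider/database call.
var cacheKey = "Profile_" + HttpContext.Current.User.Identity.Name; // hypothetical key scheme
var profile = HttpRuntime.Cache[cacheKey] as ProfileBase;
if (profile == null)
{
    profile = HttpContext.Current.Profile; // first access goes through the profile provider
    AddToCache(cacheKey, profile, 20);     // slides out of the cache after 20 idle minutes
}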
