Our project runs on ASP.NET. We are using Entity Framework with LINQ (lambda syntax) and we need to prevent concurrent inserts into the same table. I tried the ReaderWriterLock class, but it only works within one session (e.g. when multiple tabs are open in the same browser), not across different browsers. I have also read about adding a timestamp column to the table (not sure it can solve our problem) or using transactions, but I do not know exactly how to use them in a web application with LINQ.
Can you tell me please how to handle this exclusive write access in ASP.NET?
ReaderWriterLockSlim could be a good choice, but if you want every request thread in the application to share the same lock, the ReaderWriterLockSlim must be a static member.
That is, your class should look like this:
public class Class1
{
    private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
}
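For instance, a write could be guarded like this; a minimal sketch, where InsertRecord, MyDbContext and MyEntity are illustrative names, not part of the question:

```csharp
using System.Threading;

public class Class1
{
    private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    // Hypothetical insert method guarded by the shared static lock.
    public void InsertRecord(MyEntity entity)
    {
        _lock.EnterWriteLock();
        try
        {
            using (var context = new MyDbContext()) // assumed EF context type
            {
                context.MyEntities.Add(entity);
                context.SaveChanges();
            }
        }
        finally
        {
            _lock.ExitWriteLock(); // always release, even if SaveChanges throws
        }
    }
}
```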
Important note
With an application-layer lock you can only limit your own application's threads, so that one thread accesses the database at a time. Other applications (a non-ASP.NET process, or another application in a different IIS application pool) can still read and write the database in parallel.
If you want a 100% effective solution, you must use database transactions. If SQL Server is the RDBMS, you can go for a transaction with Serializable isolation level:
Volatile data can be read but not modified, and no new data can be added during the transaction.
Learn more here.
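With Entity Framework, one way to get a Serializable transaction is TransactionScope. A sketch under assumed names (MyDbContext, Orders, orderNumber are illustrative):

```csharp
using System;
using System.Linq;
using System.Transactions;

// Serializable keeps the read and the insert atomic, so two concurrent
// requests cannot both pass the existence check and insert a duplicate.
var options = new TransactionOptions { IsolationLevel = IsolationLevel.Serializable };

using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
using (var context = new MyDbContext()) // assumed EF context type
{
    if (!context.Orders.Any(o => o.Number == orderNumber))
    {
        context.Orders.Add(new Order { Number = orderNumber });
        context.SaveChanges();
    }
    scope.Complete(); // commit; the scope rolls back automatically if not called
}
```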
Related
I am in a situation where the requirement is to keep an application-level object in Web API which can be accessed by all requests. I know one can use HttpContext.Current, but that does not fit here since HttpContext only lives for the lifetime of a request. I need a solution where I can keep an object that all requests can access and update as required.
Use a static class to hold your application-level objects. Static classes and static data members are created once for the application lifetime, and all ASP.NET requests can access them.
I learnt this the hard way. Some time back, I mistakenly created a static field to hold a customer-specific database connection string in an ASP.NET Web API project, and it became a mess. On each customer's login the field was set (overriding the previous value), so requests from previously logged-in customers used the newly set connection string for their queries. It was an embarrassing situation when customers inadvertently saw each other's data.
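One sketch that avoids that mistake is to key the shared state per customer rather than using a single field. AppState and the method names here are illustrative, not a prescribed API:

```csharp
using System.Collections.Concurrent;

// Illustrative application-wide store. Keying the value per customer
// avoids the single-shared-field mistake described above, and
// ConcurrentDictionary handles concurrent request threads safely.
public static class AppState
{
    private static readonly ConcurrentDictionary<string, string> _connectionStrings =
        new ConcurrentDictionary<string, string>();

    public static void SetConnectionString(string customerId, string value)
    {
        _connectionStrings[customerId] = value;
    }

    public static string GetConnectionString(string customerId)
    {
        string value;
        return _connectionStrings.TryGetValue(customerId, out value) ? value : null;
    }
}
```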
You could use SessionState (per session).
I.e.
Session["YourDataKey"] = ApplicationLevelObject;
And then check the session state variable on each request that requires it.
However, if you require the object to outlive a single session, i.e. be shared across every user session, then I would suggest persisting your object to a database. You could use an ORM such as Entity Framework.
Cheers
I use Fluent NHibernate code to create a MySQL database SessionFactory. No config files (just one value for the connection string in the connectionStrings section of the configuration file).
The SessionFactory creation code is contained in a Data tier class, SessionFactoryManager, which implements a singleton internal SessionFactory that the Data and Business tiers use to get all sessions via SessionFactoryManager.OpenSession().
Some of my Business tier methods internally call SessionFactoryManager.OpenSession() to create sessions in a way that is transparent to the Presentation layer. So, when calling these methods there is no parameter or return value involving a session (to keep the Presentation layer "session-agnostic" when using those Business tier methods).
My problem comes when I write the integration tests for the Business layer: I would like to make them run on a SQLite in-memory database. I create a SessionFactoryManager which uses Fluent configuration to configure the SQLite database.
But when testing those methods that internally create the session, I cannot tell them to use my testing SessionFactory (configured to use SQLite). So the "real" SessionFactory is called, and the MySQL database is used instead of SQLite.
I'm thinking of several possible solutions, but none of them seems right.
I could migrate the NHibernate configuration in the Data layer to config files, and make different NHibernate config files for the development/production and test environments, but I would really prefer to stick with Fluent code.
I could also modify my Data layer to use a single configuration value, databaseMode or similar, that sets the database to be used: testing in-memory or the real one. And write some switch(databaseMode) statements like "case inMemory: { ... fluent code for in-memory SQLite... } case standard: { ... fluent code for standard database ... }". I don't like this approach at all, I don't want to modify my Data tier code functionality just for testing purposes.
Notice that I'm not testing Data layer, but Business layer. Not interested in testing NHibernate mappings, Dao or similar functionality. I already have unit tests for that, running OK with SQLite database.
Also, changing database is not a requirement of my application, so I'm not quite interested in implementing significant changes that allow me to dynamically change the DBMS, I only came to this need in order to write the tests.
A significant point: when using in-memory SQLite the database connection must be the same for all new sessions, otherwise the database objects are not available to the new sessions. So when creating a new session with SessionFactory.OpenSession() a parameter "connection" must be provided. But this parameter should not be used with non in-memory database. So the switch(databaseMode) should be used for any single session creation! Another Data layer code change that I don't like at all.
I'm seriously considering giving up and running my tests against the real database, or at least an empty one, with its objects created and dropped for each test execution. But then test execution will surely be slower. Any ideas? Thanks in advance.
Finally my solution was Inversion Of Control: I changed my data tier so I can inject a custom SessionFactoryBuilder class that makes the Fluently.Configure(...) magic.
In my data tier I use the "real" MySqlSessionFactoryBuilder, in my test projects I write TestMySqlFactoryBuilder or TestSQLiteSessionFactoryBuilder classes, or whatever I need.
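A minimal sketch of what that injection can look like with Fluent NHibernate; the interface, class names and the "Default" connection-string key are illustrative, not the actual project code:

```csharp
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;

// Hypothetical abstraction the data tier depends on.
public interface ISessionFactoryBuilder
{
    ISessionFactory Build();
}

// Production builder: MySQL, connection string from configuration.
public class MySqlSessionFactoryBuilder : ISessionFactoryBuilder
{
    public ISessionFactory Build()
    {
        return Fluently.Configure()
            .Database(MySQLConfiguration.Standard
                .ConnectionString(c => c.FromConnectionStringWithKey("Default")))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<MySqlSessionFactoryBuilder>())
            .BuildSessionFactory();
    }
}

// Test builder: in-memory SQLite.
public class TestSQLiteSessionFactoryBuilder : ISessionFactoryBuilder
{
    public ISessionFactory Build()
    {
        return Fluently.Configure()
            .Database(SQLiteConfiguration.Standard.InMemory())
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<TestSQLiteSessionFactoryBuilder>())
            .BuildSessionFactory();
    }
}
```

SessionFactoryManager then receives an ISessionFactoryBuilder instead of constructing the factory itself, so the tests can pass in the SQLite builder.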
I still have a problem with the SQLite feature that requires the same connection to be used for all sessions, which must be passed as a parameter in every OpenSession() call. For the moment I have not modified my data tier to add that feature, but I would like to in the future. Probably by adding to my SessionFactory singleton a private static member to store the connection used to run SchemaExport, and a private static boolean member like PreserveConnection to state that this connection must be stored in that member and used in every OpenSession() call. And also by wrapping OpenSession() to make sure that no session is opened directly.
While developing a web site (using Entity Framework) I have come across the following questions:
1. What happens if many (let's say 10,000) people try "to write" simultaneously to the same specific table in the DB (SQL Server) via Entity Framework?
2. In my project I have modules, and for decoupling reasons I am using a singleton class (ModulesManager) which should take an Action from each module and execute it asynchronously, like the following:
public void InsertNewRecord(Action addNewRecordAction)
{
    if (addNewRecordAction != null)
    {
        addNewRecordAction.BeginInvoke(recordCallback, null);
    }
}
Is it a good approach to use a singleton class as the only place responsible for writing to the DB?
3. Can Entity Framework provide the same speed as plain SQL queries?
What happens if a lot (let's say 10,000) people try "to write" simultaneously to the same specific table in the DB (SQL Server) via Entity Framework?
If you mean inserting into the same table, those inserts will be processed based on the transaction isolation level in the database. Usually only a single transaction can hold a lock for insertion, so inserts are processed in sequence (this has nothing to do with EF). Having 10,000 users concurrently inserting doesn't seem like a sustainable architecture; some of them may time out.
In my project I have modules, and for decoupling reasons I am using a singleton class (ModulesManager) which should take an Action from each module and execute it asynchronously, like the following:
Your manager invokes the action asynchronously, so the answer mostly depends on what the action does. If it opens its own context, performs some changes and saves them, you should not have any technical problem on the EF side.
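A "safe" action in that sense might look like the following sketch; MyEntities and Record are assumed names standing in for the real context and entity types:

```csharp
// The action creates, uses and disposes its own context, so nothing
// EF-related is shared between the worker threads.
modulesManager.InsertNewRecord(() =>
{
    using (var context = new MyEntities()) // assumed ObjectContext type
    {
        context.Records.AddObject(new Record { Created = DateTime.UtcNow });
        context.SaveChanges();
    } // context and its connection are released before the callback runs
});
```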
Can Entity Framework provide the same speed as plain SQL queries?
No. EF does additional processing, so it will always be slower.
We're using a LINQ to SQL DataContext in a web application that provides read-only data to the application and is never updated (we have set ObjectTrackingEnabled = false to enforce this).
Since the data never changes (except for occasional config updates) it seems wasteful to be reloading it from SQL Server with a new DataContext for each web request.
We tried caching the DataContext in the Application object for all requests to use, but it was generating a lot of errors, and our research since then shows this was a bad idea: a DataContext should be disposed within the same unit of work, it is not thread safe, etc.
So since the DataContext is meant to be a data access mechanism, not a data store, we need to be looking at caching the data that we get from it, not the context itself.
Would prefer to do this with the entities and collections themselves so the code can be agnostic about whether it is dealing with cached or "fresh" data.
How can this be done safely?
First, I need to make sure that the entities and collections are fully loaded before I dispose of the DataContext. Is there a way to force a full load of everything from the database conveniently?
Second, I'm pretty sure that storing references to the entities and collections is a bad idea, because it will either
(a) cause the entities to be corrupted when the DataContext goes out of scope or
(b) prevent the DataContext from going out of scope
So should I clone the EntitySets and store them? If so, how? Or what's the best approach here?
This is not exactly an answer to your question, but I suggest avoiding caching on the web site side.
I would rather focus on optimizing database queries for faster and more efficient data retrieval.
Caching will:
not be scalable
need extra code for synchronization (I assume your data isn't completely static in the DB?)
be bug prone because of that extra code
eat up your web server's memory quickly; the next thing you might end up addressing is a memory issue on your web server
not work very well when you need to load-balance your web site
[Edit]
If I needed to cache 5 MB of data, I would use the Cache object, probably with lazy loading. I would use a set of lightweight collections, like ReadOnlyCollection<T> and Collection<T>. I would probably also use ReadOnlyDictionary<TKey, TValue> for quick in-memory lookups, and LINQ to Objects to manipulate the collections.
You want to cache the data retrieved from the DataContext rather than the DataContext object itself. I usually refactor commonly retrieved data into methods that implement silent caching, something like this (you may need to add thread-safety logic):
public class MyBusinessLayer {
    private List<MyType> _myTypeCache = null;
    public List<MyType> GetMyTypeList() {
        if (_myTypeCache == null) {
            _myTypeCache = // data retrieved from SQL server
        }
        return _myTypeCache;
    }
}
This is the simplest pattern that can be used and will cache for one web request. To cache for longer periods, store the contents in a longer-term storage, such as Application or Cache. For instance, to store in Application level data, use this kind of pattern.
public static List<MyType> GetMyTypeList() {
    if (Application["MyTypeCacheName"] == null) {
        Application["MyTypeCacheName"] = // data retrieved from SQL server
    }
    return (List<MyType>)Application["MyTypeCacheName"];
}
This would be for data that almost never changes, such as a static collection of status types to choose from in a DropDownList. For more volatile data, you can use the Cache with a timeout period, which should be selected based on how often the data changes. With the Cache, items can be invalidated manually with code if necessary, or with a dependency checker like SqlCacheDependency.
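A sketch of that Cache-based variant with an absolute timeout; the cache key, the 10-minute window and LoadMyTypesFromDatabase() are assumptions for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static List<MyType> GetMyTypeList()
{
    var cached = (List<MyType>)HttpRuntime.Cache["MyTypeCacheName"];
    if (cached == null)
    {
        cached = LoadMyTypesFromDatabase(); // hypothetical data-access call
        HttpRuntime.Cache.Insert(
            "MyTypeCacheName",
            cached,
            null,                            // could be a SqlCacheDependency instead
            DateTime.UtcNow.AddMinutes(10),  // absolute expiration
            Cache.NoSlidingExpiration);
    }
    return cached;
}
```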
Hope this helps!
I'm working on a project in ASP.NET using Web Forms. I'm using Entity Framework to save data in Microsoft SQL Server.
My question is:
Is it possible to use a static class to keep the ObjectContext of EF alive and put/get entities NOT yet saved inside the ObjectContext?
I want to create an object, then add it with AddObject on the ObjectContext, but NOT call SaveChanges. All this in one web form. And then, in another web form, access the ObjectContext and get the object that was added.
Is this possible?
My rules to using ObjectContext:
Do not use static context.
Do not share context.
You are trying to violate both rules. If you do, your application will behave non-deterministically. Create a new ObjectContext instance for each request. It is the same as opening a new connection and starting a new transaction per request instead of sharing one connection and one transaction among all of them.
Further explanation is also here. Also check the linked questions in the right column and you will see what kinds of problems people run into just because of violating one or both of these rules.
Also, in a web application it becomes even more interesting, because ObjectContext is not thread safe.
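"A new instance per request" in practice means something like this sketch in each page or handler; MyEntities, Customers and requestedId are assumed names:

```csharp
// A short-lived context scoped to one request's unit of work.
protected void Page_Load(object sender, EventArgs e)
{
    using (var context = new MyEntities()) // assumed ObjectContext type
    {
        var customer = context.Customers.First(c => c.Id == requestedId);
        // ... bind customer to the page ...
    } // disposed at the end of this unit of work
}
```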
You could add it to the application items collection. See this blog post for syntax and such.
http://www.informit.com/articles/article.aspx?p=27315&seqNum=3
Generally, you don't want to. An ObjectContext is intended to be a unit of work, alive for a single set of related transactions. In an ASP.NET application, that generally corresponds to a single request.
If you must keep it alive for multiple requests, I wouldn't use either a static class or the application context. Instead, I'd recommend using the Cache, and attaching callbacks to it that let you ensure all your transactions are committed before it gets evicted, just in case.
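As a sketch of that eviction-callback idea, assuming names like MyObjectContext and a 20-minute window (keeping a context alive this long remains fragile, as noted above):

```csharp
using System;
using System.Web;
using System.Web.Caching;

// Park the context in Cache and flush pending work when it is evicted.
private void CacheContext(MyObjectContext context)
{
    HttpRuntime.Cache.Insert(
        "SharedObjectContext",
        context,
        null,
        DateTime.UtcNow.AddMinutes(20),
        Cache.NoSlidingExpiration,
        CacheItemPriority.Default,
        (key, value, reason) =>
        {
            var ctx = (MyObjectContext)value;
            ctx.SaveChanges(); // commit anything still pending
            ctx.Dispose();
        });
}
```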