Developing a web site (using Entity Framework) I have encountered the following questions:
1. What happens if a lot (let's say 10,000) of people try to write simultaneously to the same table in the DB (SQL Server) via Entity Framework?
2. In my project I have modules and, for decoupling reasons, I use a singleton class (ModulesManager) which should take an Action from each module and execute it asynchronously, like the following:
public void InsertNewRecord(Action addNewRecordAction)
{
    if (addNewRecordAction != null)
    {
        addNewRecordAction.BeginInvoke(recordCallback, null);
    }
}
Is it a good approach to use a singleton class as the only place responsible for writing to the DB?
3. Can Entity Framework provide the same speed as plain SQL queries?
What happens if a lot (let's say 10,000) of people try to write simultaneously to the same table in the DB (SQL Server) via Entity Framework?
If you mean inserting into the same table, those inserts will be processed according to the transaction isolation level in the database. Usually only a single transaction can hold the lock for insertion, so inserts are processed in sequence (this has nothing to do with EF). Having 10,000 users inserting concurrently doesn't look like a sustainable architecture - some of them may time out.
In my project I have modules and, for decoupling reasons, I use a singleton class (ModulesManager) which should take an Action from each module and execute it asynchronously, like the following:
Your manager invokes the action asynchronously, so the answer mostly depends on what the action does. If it opens its own context, performs some changes and saves them, you should not have any technical problem on the EF side.
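As an illustration, such an action could look like the following sketch. ModuleDbContext, Orders and Order are assumed names, not part of the question:

```csharp
using System;

// Hedged sketch: the action opens its own context, does its work and
// saves, so every asynchronous invocation is isolated from the others.
Action addNewRecord = () =>
{
    using (var context = new ModuleDbContext())   // assumed context type
    {
        context.Orders.Add(new Order { Number = 42 });
        context.SaveChanges();   // one unit of work per invocation
    }
};

modulesManager.InsertNewRecord(addNewRecord);
```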
Can Entity Framework provide the same speed as using SQL queries?
No. EF does additional processing, so it will always be slower.
I would like to understand an Axon feature.
Currently, we are developing an application using microservice architecture.
We want to store all service events in a central RDBMS database, for example PostgreSQL.
Is it possible to use such a store?
We have used the configuration below to store events in the same domain DB:
@Bean
public AggregateFactory<UserAggregate> userAggregateFactory() {
    SpringPrototypeAggregateFactory<UserAggregate> aggregateFactory =
            new SpringPrototypeAggregateFactory<>();
    aggregateFactory.setPrototypeBeanName("userAggregate");
    return aggregateFactory;
}
Now we want to store events in a central Event Store DB, not with domain DB.
Firstly, the AggregateFactory within an Axon application does not define where or how your events are stored at all.
Instead, I suggest reading the Event Bus & Event Store section of the Axon Framework reference guide, which explains how you can achieve this.
The short answer to your question is, by the way, yes: you can have a single Event Store backed by an RDBMS, like PostgreSQL, to store all your events in.
Between duplicated instances of a given application it is actually highly recommended to use the same storage location.
As soon as you start spanning different Bounded Contexts, though, I would suggest defining a separate Event Store per context.
Concluding, you are using an old version of Axon Framework.
I would highly recommend moving to at least the latest Axon 3 release, 3.4.3, but ideally you should start using 4.1.2.
Note that there is no active development taking place on Axon 3 any more, hence the suggestion.
I've started using EF Core for the first time with ASP.NET Core and I must confess that I'm not finding the experience that good (especially if you're coming from NH). Now, I've got the following scenario:
2 different DbContexts, each using a different schema
the API has been set up so that some API operations are wrapped in a transaction (in practice, everything has been set up so that there's a single SqlConnection shared by both contexts)
I'd like to test these kinds of methods, but I'm not sure about the best approach... After reading the docs, it looks like the best option is using SQLite, but there's a small gotcha: setting up the db.
The docs say that I can use the context.Database.EnsureCreated method. Unfortunately, that will only work if the in-memory db hasn't been created yet. In other words, after calling it from the 1st context instance, it won't do anything when called on the 2nd context instance (because both contexts share the same db and it has already been created by the 1st call). In practice, this means that I'll end up with a partial db that only has the tables mapped to the entities of the 1st context.
Is there a way to force the creation of the 2nd context's tables? I.e., can I write something like this with EF Core:
contextA.Database.EnsureCreated();
contextB.Database.JustCreateTheTablesPlease();
Or do I need to recreate my db from a SQL script before running my tests?
Thanks.
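For what it's worth, one possible way out is to drop down to EF Core's relational services: IRelationalDatabaseCreator can emit a context's CREATE TABLE statements without the "database already exists" check that makes EnsureCreated a no-op. A sketch, with contextA and contextB standing for the two contexts from the question:

```csharp
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Infrastructure;
using Microsoft.EntityFrameworkCore.Storage;

// Creates the shared database plus contextA's tables.
contextA.Database.EnsureCreated();

// EnsureCreated on contextB would now do nothing, so ask the relational
// database creator to add contextB's tables to the existing database.
contextB.Database.GetService<IRelationalDatabaseCreator>().CreateTables();
```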
I use Fluent NHibernate code to create a MySQL database SessionFactory. No config files (just one value for the connection string in the connectionStrings section of the configuration file).
The SessionFactory creation code is contained in a Data tier class, SessionFactoryManager, which implements a singleton internal SessionFactory that is used by the Data and Business tiers to get all sessions via SessionFactoryManager.OpenSession().
Some of my Business tier methods internally call SessionFactoryManager.OpenSession() to create sessions in a way that is transparent to the Presentation layer. So, when calling these methods there is no parameter or return value involving a session (to keep the Presentation layer "session-agnostic" when using those Business tier methods).
My problem comes when I write the integration tests for the Business layer: I would like to make them run on a SQLite in-memory database. I create a SessionFactoryManager which uses Fluent configuration to configure the SQLite database.
But when testing those methods that internally create the session, I cannot tell them to use my testing SessionFactory (configured to use SQLite). So the "real" SessionFactory is used, and hence the MySQL database, not SQLite.
I'm thinking of several possible solutions, but none of them seems right.
I could migrate the NHibernate configuration in the Data layer to config files, and create different NHibernate config files for the development/production and test environments, but I would really prefer to stick with Fluent code.
I could also modify my Data layer to use a single configuration value, databaseMode or similar, that sets the database to be used: the testing in-memory one or the real one. And write some switch(databaseMode) statements like "case inMemory: { ... fluent code for in-memory SQLite ... } case standard: { ... fluent code for the standard database ... }". I don't like this approach at all; I don't want to modify my Data tier code just for testing purposes.
Notice that I'm not testing the Data layer, but the Business layer. I'm not interested in testing NHibernate mappings, DAOs or similar functionality; I already have unit tests for that, running OK with a SQLite database.
Also, changing the database is not a requirement of my application, so I'm not really interested in implementing significant changes that would allow me to dynamically change the DBMS; I only came to this need in order to write the tests.
A significant point: when using in-memory SQLite, the database connection must be the same for all new sessions, otherwise the database objects are not available to the new sessions. So when creating a new session with SessionFactory.OpenSession(), a "connection" parameter must be provided. But this parameter should not be used with a non in-memory database. So the switch(databaseMode) would be needed for every single session creation! Another Data layer code change that I don't like at all.
I'm seriously considering giving up and running my tests against the real database, or at least an empty one, with its objects created and dropped on every test execution. But then test execution will surely be slower. Any ideas? Thanks in advance.
Finally my solution was Inversion of Control: I changed my data tier so I can inject a custom SessionFactoryBuilder class that does the Fluently.Configure(...) magic.
In my data tier I use the "real" MySqlSessionFactoryBuilder; in my test projects I write TestMySqlFactoryBuilder or TestSQLiteSessionFactoryBuilder classes, or whatever I need.
I still have problems with the SQLite feature that requires the same connection to be used for all sessions and passed as a parameter in every OpenSession() call. For the moment I have not modified my data tier to add that feature, but I would like to do it in the future. Probably by adding to my SessionFactory singleton a private static member to store the connection used for SchemaExport, and a private static boolean member like PreserveConnection stating that this connection must be stored in that private member and used in every OpenSession() call. And also wrapping OpenSession() to make sure that no session is opened directly.
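A sketch of what that injectable builder could look like; the interface and class names here (ISessionFactoryBuilder, MySqlSessionFactoryBuilder, TestSQLiteSessionFactoryBuilder) are assumptions for illustration, not the author's actual code:

```csharp
using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;
using NHibernate;

// The seam: the data tier depends only on this interface.
public interface ISessionFactoryBuilder
{
    ISessionFactory Build();
}

// Production: MySQL, configured from the app's connection string.
public class MySqlSessionFactoryBuilder : ISessionFactoryBuilder
{
    private readonly string _connectionString;

    public MySqlSessionFactoryBuilder(string connectionString)
    {
        _connectionString = connectionString;
    }

    public ISessionFactory Build() =>
        Fluently.Configure()
            .Database(MySQLConfiguration.Standard.ConnectionString(_connectionString))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<MySqlSessionFactoryBuilder>())
            .BuildSessionFactory();
}

// Tests: in-memory SQLite, injected instead of the MySQL builder.
public class TestSQLiteSessionFactoryBuilder : ISessionFactoryBuilder
{
    public ISessionFactory Build() =>
        Fluently.Configure()
            .Database(SQLiteConfiguration.Standard.InMemory())
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<TestSQLiteSessionFactoryBuilder>())
            .BuildSessionFactory();
}
```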
Our project runs on ASP.NET; we are using Entity Framework with LINQ (lambda syntax) and we need to prevent simultaneous inserts into a table. I tried to use the ReaderWriterLock class, but it only works within one session (when more tabs are opened in the same browser), not across different browsers. I also read about creating the table with timestamps (not sure if that can solve our problem) or using transactions, but I do not know exactly how to use them in a web application with LINQ.
Can you tell me please how to handle this exclusive write access in ASP.NET?
ReaderWriterLockSlim could be a good choice, but if you want ANY thread/request to share the same lock, the ReaderWriterLockSlim itself must be a static member.
That is, your class should look like this:
public class Class1
{
    private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();
}
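Extended with a hypothetical write path, it could look like the following sketch (InsertRecord, MyDbContext, Records and Record are assumed names for illustration):

```csharp
using System.Threading;

public class Class1
{
    private static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    // Hypothetical method: only one thread in this process writes at a time.
    public void InsertRecord(MyDbContext context, Record record)
    {
        _lock.EnterWriteLock();
        try
        {
            context.Records.Add(record);
            context.SaveChanges();
        }
        finally
        {
            _lock.ExitWriteLock();   // always release, even if SaveChanges throws
        }
    }
}
```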
Important note
With an application-layer lock you can only limit the threads of your own application, so that one thread accesses the database at a time. Other applications (not this ASP.NET one, or another application in another application pool on the same IIS) will still be able to read and write to the database in parallel.
If you want a 100% effective solution, you must use database transactions. If SQL Server is the RDBMS, you can go for a transaction with the Serializable isolation level:
Volatile data can be read but not modified, and no new data can be
added during the transaction.
Learn more here.
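A minimal sketch of such a transaction around the EF work, using TransactionScope; MyDbContext, Records and Record are assumed names:

```csharp
using System.Transactions;

// Serializable: no other transaction can insert conflicting rows
// until this one commits or rolls back.
using (var scope = new TransactionScope(TransactionScopeOption.Required,
        new TransactionOptions { IsolationLevel = IsolationLevel.Serializable }))
using (var context = new MyDbContext())
{
    context.Records.Add(new Record { Name = "example" });
    context.SaveChanges();
    scope.Complete();   // commit; disposing without Complete rolls back
}
```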
I'm evaluating some technologies for a new web application, which should use EF5 and Knockout JS with Web API. I wanted to take advantage of the OData feature when returning IQueryable, but am currently running into the problem of how to convert my EF models to my business models.
As far as I've read, if I want a more complex DB (computed columns, stored procedures, ...) I should use the DB First approach. (Correct me if I'm wrong.)
Because I need to use the DB First approach and want my models to be independent of the DB, I need to create them in addition to the EF models. And when I return my business model as IQueryable from the data layer, I lose the possibility to execute additional queries directly on the DB; instead they are executed on the ASP.NET server directly.
Of course I don't plan to run complex queries over OData and would implement those as additional actions anyway, but it might be useful for slower clients (smartphones, ...) to limit the returned data and perform additional filters directly on the server.
Is there any way out of this dilemma, to be still able to use OData?
Regards
Peter
You can try using Code First and EF Migrations to create/upgrade the database. With migrations you can create custom migrations that can be just SQL scripts, to achieve what can't be done automatically with Code First itself. The Database First approach is fine as well.
Ask yourself if you really want or need to support multiple backends. It is possible with EF but hard to maintain. In this case I assume your conceptual model (csdl) would be the same for all databases, but you would have multiple store-specific models (ssdl files). Since your model would be the same for all databases, you would have the same C# types regardless of the database you are using.
When supporting multiple databases you won't be able to run SQL queries against the database (or, to be more specific, you will get exceptions if you run a SQL query specific to one database against a different database), but ideally you should not need to. In the worst case you could enclose the logic you would like to code in SQL in a stored procedure that exists in all databases. Again, I don't know when this would be needed (the only thing that comes to mind is performance), but since you are planning on using OData you wouldn't be able to run these queries anyway unless you start using Service Operations.
Since your conceptual model would be the same regardless of the database, you would have the same types regardless of the database. You could try using these types for both the data layer and the business model (especially if you go with POCOs). An alternative would be to use POCOs/DTOs. (I have not tried OData support in Web API, but with WCF Data Services the service itself would actually use the EF types, so you would not even be able to tell the service to use a different set of types.)
You actually don't lose the ability with DB First models to execute queries against the business model, as long as your transforms aren't too complex. For example, I have an OData service which has a PersistedCustomer (DB model) and a Customer (business model). With EF5, I can write the LINQ which transforms IQueryable<PersistedCustomer> to IQueryable<Customer>, query against the IQueryable<Customer>, and EF can translate the criteria back against the original database.
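A sketch of the kind of projection described above; PersistedCustomer, Customer, their properties and the context are assumed names:

```csharp
using System.Linq;

// Project the DB model onto the business model without materializing:
// the result is still an IQueryable, so nothing runs against the DB yet.
IQueryable<Customer> customers =
    context.PersistedCustomers.Select(p => new Customer
    {
        Id = p.Id,
        Name = p.Name
    });

// Further criteria (e.g. from OData query options) compose on top and
// are translated back to SQL against the underlying table.
var filtered = customers.Where(c => c.Name.StartsWith("A"));
```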