Before I invest the time in modifying the SubSonic 3 source, I figured I'd ask in case I'm missing something simple.
Is it possible to use the SubSonic 3 Repository with migrations on a SQLite in-memory database? I couldn't find a way to force the DbDataProvider to keep the connection open, so the in-memory SQLite database vanishes when the connection gets closed.
Here is the unit test, with the connection string I was trying:
public class SQLite_InMemory_SimpleRepositoryTests
{
    public class Job
    {
        public Guid JobId { get; set; }
        public string JobName { get; set; }
    }

    [Fact]
    public void SQLite_InMemory_SimpleRepo_CanStayOpen()
    {
        IDataProvider provider = ProviderFactory.GetProvider(
            "Data Source=:memory:;Version=3;New=True;Pooling=True;Max Pool Size=1;",
            "System.Data.SQLite");
        IRepository repository = new SimpleRepository(provider, SimpleRepositoryOptions.RunMigrations);

        for (int i = 0; i < 10000; i++)
        {
            var job = new Job { JobId = Guid.NewGuid(), JobName = "Job_" + i };
            repository.Add(job);
        }
    }
}
I tried setting the "Shared Connection" on the IDataProvider, but the connection still seemed to close.
If there's no way to do this, I'll update the SubSonic source and submit the changes.
Thanks!
Interesting - no, there's no way I can think of to do this, other than maybe creating a static IDataProvider - but even then we close off the connection for things like executing scalars, etc.
I suppose you could create such a thing by implementing IDataProvider and setting things up as you need - all execution goes through it. But this is making me wonder if we explicitly shut things down in the calling code - which would be bad design on my part... hmmm.
Would love to have this feature...
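For what it's worth, newer builds of System.Data.SQLite support shared-cache in-memory databases, which sidesteps the provider entirely: the database survives as long as at least one connection to it stays open. A minimal sketch of the idea (this assumes the `FullUri` keyword and SQLite 3.7.13+ shared-cache URIs are available in your System.Data.SQLite version - treat the connection string as an assumption to verify):

```csharp
using System.Data.SQLite;

class SharedInMemoryExample
{
    static void Main()
    {
        // One "anchor" connection held open for the lifetime of the test run;
        // the in-memory database lives exactly as long as this connection does.
        const string connString = "FullUri=file::memory:?cache=shared;Version=3;";

        using (var anchor = new SQLiteConnection(connString))
        {
            anchor.Open();

            // Other connections using the same URI see the same database,
            // even though they open and close independently (as SubSonic's
            // provider does internally).
            using (var worker = new SQLiteConnection(connString))
            {
                worker.Open();
                using (var cmd = worker.CreateCommand())
                {
                    cmd.CommandText = "CREATE TABLE Job (JobId TEXT, JobName TEXT)";
                    cmd.ExecuteNonQuery();
                }
            } // worker closes here, but the anchor keeps the database alive
        }
    }
}
```

If that works with your SQLite version, the provider can keep opening and closing connections as it pleases without losing the schema the migrations created.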
I'm learning how to use ASP.NET 5 (vNext). In an attempt to do this, I'm working on a basic application. In this application, I'm trying to connect to the database from a couple of POCOs (Customer, Order, etc.) using Dapper. If I understand things correctly, it's expensive to create, connect to, and tear down connections to a database. If this is true, I'm trying to figure out the recommended way to share a connection across multiple objects.
Currently, I have the following:
public class Order
{
    public void Save()
    {
        using (var connection = new SqlConnection("[MyConnectionString]"))
        {
            connection.Open();
            connection.Execute("[INSERTION SQL]");
        }
    }

    public List<Order> FindByCustomerEmailAddress(string emailAddress)
    {
        using (var connection = new SqlConnection("[MyConnectionString]"))
        {
            connection.Open();
            return connection.Query<Order>(
                "SELECT o.* FROM [Order] o JOIN Customer c ON o.CustomerId = c.CustomerId WHERE c.EmailAddress = @EmailAddress",
                new { EmailAddress = emailAddress }).ToList();
        }
    }
}
public class Customer
{
    public void Save()
    {
        using (var connection = new SqlConnection("[MyConnectionString]"))
        {
            connection.Open();
            connection.Execute("[INSERTION SQL]");
        }
    }

    public Customer FindByEmailAddress(string emailAddress)
    {
        using (var connection = new SqlConnection("[MyConnectionString]"))
        {
            connection.Open();
            return connection.Query<Customer>(
                "SELECT * FROM Customer WHERE EmailAddress = @EmailAddress",
                new { EmailAddress = emailAddress }).FirstOrDefault();
        }
    }
}
I thought about creating a Database class that looks like this:
public static class Database
{
    private static IDbConnection Connection { get; set; }

    public static IDbConnection GetConnection()
    {
        if (Connection == null)
        {
            Connection = new SqlConnection("[MyConnectionString]");
            Connection.Open();
        }
        return Connection;
    }
}
public class Order
{
    public void Save()
    {
        var connection = Database.GetConnection();
        connection.Execute("[INSERTION SQL]");
    }

    public List<Order> FindByCustomerEmailAddress(string emailAddress)
    {
        var connection = Database.GetConnection();
        return connection.Query<Order>("SELECT ...").ToList();
    }
}
However, after thinking about this, I'm not sure this is a good strategy for managing a database connection. The use of static in this manner seems dangerous. Yet it seems like someone must have solved this issue before, and nothing I've found explains it well enough for me to tell whether it actually works. Can someone share the recommended approach for managing database connections efficiently?
Thank you!
Opening and closing connections to a database server is indeed expensive. However, .NET implements connection pooling for exactly this reason, and it is on by default. You can configure how many connections the pool keeps open (I don't recall the default).
So if your connection string is the same, .NET will reuse an open connection from the pool; if it's different, it will create a new one.
Your first code sample is correct in using a using block: when you're done, the Dispose/Close returns the connection to the pool.
See more about this here: https://msdn.microsoft.com/en-us/library/8xx3tyca(v=vs.110).aspx
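To make the pooling point concrete, here is a hedged sketch of the per-call pattern the answer recommends (the connection string and SQL are placeholders from the question, not a working configuration). The key idea is that `new SqlConnection(...)` + `Open()` borrows a physical connection from the pool, and `Dispose()` returns it rather than tearing it down:

```csharp
using System.Data.SqlClient;

static class OrderStore
{
    // Placeholder; same string every time means the same pool is reused.
    const string ConnString = "[MyConnectionString]";

    public static void Save(string insertSql)
    {
        // Cheap despite appearances: Open() typically hands back an
        // already-open physical connection from the pool.
        using (var connection = new SqlConnection(ConnString))
        {
            connection.Open();
            using (var command = new SqlCommand(insertSql, connection))
            {
                command.ExecuteNonQuery();
            }
        } // Dispose() returns the connection to the pool
    }
}
```

So each method can open "its own" connection without any shared state, and the pool does the sharing for you behind the scenes.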
I am pretty new to NUnit (and automated testing in general). I have recently done some Ruby on Rails work and noticed that when my test suite creates objects (such as a new user) over the course of a run, they are never actually committed to the database, so I can run the tests over and over without worrying about that user already existing.
I am now trying to accomplish the same thing in NUnit, but I am not quite sure how to go about it. Do I create a transaction in the SetUp and TearDown blocks? Thanks.
Why would you talk to the database during unit tests? That turns your unit tests into integration tests by default. Instead, create wrappers for all database communication and stub/mock them during unit tests. Then you don't have to worry about database state before and after.
Now, if you are not willing to go to that level of refactoring: the problem with transactions is that you need an open connection. So if your method under test handles all its communication on its own, it is really difficult to inject a transaction that you can create in setup and roll back in teardown.
Maybe you can use this. It is ugly, but perhaps it can work for you:
namespace SqlServerHandling
{
    [TestFixture]
    public sealed class TestTransactionRollBacks
    {
        private const string ConnectionString = "Data Source=XXXDB;Initial Catalog=XXX;User Id=BLABLA;Password=BLABLA";
        private SqlConnection _connection;
        private SqlTransaction _transaction;

        [SetUp]
        public void SetUp()
        {
            _connection = new SqlConnection(ConnectionString);
            _connection.Open(); // the connection must be open before a transaction can begin
            _transaction = _connection.BeginTransaction();
        }

        [TearDown]
        public void TearDown()
        {
            _transaction.Rollback(); // undo everything the test did
            _connection.Dispose();
        }

        [Test]
        public void Test()
        {
            Foo foo = new Foo(_connection);
            object result = foo.Bar();
        }
    }

    internal class Foo
    {
        private readonly SqlConnection _connection;
        private readonly object _someObject = new object();

        public Foo(SqlConnection connection)
        {
            // Note: any SqlCommand run on this connection must have its
            // Transaction property set to the pending transaction.
            _connection = connection;
        }

        public object Bar()
        {
            // Do your stuff
            return _someObject;
        }
    }
}
I agree with Morten's answer, but you might want to look at this very old MSDN Magazine article on the subject: Know Thy Code: Simplify Data Layer Unit Testing using Enterprise Services
I use SQLite for unit tests with NHibernate. Even if you're not using NHibernate, it should be possible to do the same. SQLite has an in-memory mode, where you can create a database in memory and persist data there. It is fast, works well, and you can simply throw away and recreate the schema for each test or fixture as you see fit.
You can see the example from Ayende's blog for an overview of how it's done. He is using NHibernate, but the concept should work with other ORMs or a straight DAL as well.
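A rough outline of that fixture with NHibernate follows. The class and property names are illustrative, and you'd add your own mappings; the two load-bearing details are `connection.release_mode = on_close` (so the session keeps its connection, and with it the in-memory database, alive for the whole test) and exporting the schema onto the session's own connection:

```csharp
using NHibernate;
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

// Sketch of a base fixture: each test gets a fresh in-memory database
// whose schema is rebuilt in SetUp and discarded in TearDown.
public abstract class InMemoryDatabaseFixture
{
    protected ISessionFactory SessionFactory;
    protected ISession Session;
    private Configuration _cfg;

    public void SetUp()
    {
        _cfg = new Configuration()
            .SetProperty("connection.driver_class", "NHibernate.Driver.SQLite20Driver")
            .SetProperty("dialect", "NHibernate.Dialect.SQLiteDialect")
            .SetProperty("connection.connection_string", "Data Source=:memory:;Version=3;New=True;")
            // Keep the connection open for the session's lifetime, or the
            // in-memory database disappears between statements.
            .SetProperty("connection.release_mode", "on_close");
        // _cfg.AddAssembly(...) your mapping assembly here.

        SessionFactory = _cfg.BuildSessionFactory();
        Session = SessionFactory.OpenSession();

        // Build the schema on this session's connection so it targets the
        // same in-memory database the test will use.
        new SchemaExport(_cfg).Execute(false, true, false, Session.Connection, null);
    }

    public void TearDown()
    {
        Session.Dispose(); // closing the connection discards the database
    }
}
```

Each fixture run is fully isolated, which is exactly the Rails-style behavior the question asks for, without needing transactions at all.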
I'm trying to do something slightly unusual... I currently have a SQLite database, accessed with NHibernate. This database is frequently uploaded to a server. I have a new requirement to create a new table for reporting purposes, which is expected to become pretty big. This table doesn't need to be uploaded to the server, so I'd like to put it in a separate database, and use ATTACH DATABASE to access it transparently from my main database.
The problem is I don't know how to do that with NHibernate... How can I tell NHibernate to attach the other database when it connects? I can't find any connection string parameter or NH configuration property that allows this... is it even possible?
An acceptable option would be to manually execute the ATTACH DATABASE command when the connection is opened, but I don't know how to do that. When I build the NH session factory, it immediately tries to update the schema (hbm2ddl.auto = update), and I don't have an opportunity to do anything with the connection before that. So it will just try to create the new table on my main database, which of course is not what I want...
Has anybody ever done that before? How did you do it?
Thanks
EDIT: In case someone needs to do the same, here's my solution, inspired by Diego's answer
Connection provider:
public class AttachedDbConnectionProvider : DriverConnectionProvider
{
    private string _attachedDbAlias;
    private string _attachedDbFileName;

    public override IDbConnection GetConnection()
    {
        var connection = base.GetConnection();
        if (!string.IsNullOrEmpty(_attachedDbAlias) && !string.IsNullOrEmpty(_attachedDbFileName))
        {
            using (var attachCommand = connection.CreateCommand())
            {
                attachCommand.CommandText = string.Format(
                    "ATTACH DATABASE '{0}' AS {1}",
                    _attachedDbFileName.Replace("'", "''"),
                    _attachedDbAlias);
                attachCommand.ExecuteNonQuery();
            }
        }
        return connection;
    }

    public override void Configure(IDictionary<string, string> settings)
    {
        base.Configure(settings);
        settings.TryGetValue("connection.attached_db_alias", out _attachedDbAlias);
        settings.TryGetValue("connection.attached_db_filename", out _attachedDbFileName);
    }
}
Configuration file:
<property name="connection.provider">MyApp.DataAccess.AttachedDbConnectionProvider, MyApp.DataAccess</property>
<property name="connection.attached_db_alias">reportdb</property>
<property name="connection.attached_db_filename">mydatabase.report.db</property>
Now, to map a class to a table in the attached database, I just need to specify the "reportdb." prefix in the mapping file.
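For example, a mapping for a report class might qualify the table with the attachment alias like this (the class and column names here are made up; only the `reportdb` alias comes from the configuration above):

```xml
<class name="ReportEntry" table="report_entry" schema="reportdb">
  <id name="Id" column="id">
    <generator class="native" />
  </id>
  <property name="Message" column="message" />
</class>
```

NHibernate prefixes the table name with the schema, so generated SQL refers to `reportdb.report_entry`, which SQLite resolves against the attached database.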
This might help...
public class MyConnectionProvider : DriverConnectionProvider
{
    public override IDbConnection GetConnection()
    {
        var connection = base.GetConnection();
        var attachCommand = connection.CreateCommand();
        attachCommand.CommandText = "ATTACH DATABASE FOO";
        attachCommand.ExecuteNonQuery();
        return connection;
    }
}
Config:
<property name="connection.provider">MyConnectionProvider, MyAssembly</property>
I moved to ASP.NET from PHP, where queries are run directly. So I always create the connection in the Page_Load event, dispose of it after doing everything needed, and access the data with NpgsqlCommand. (Yes, I use PostgreSQL in my ASP.NET applications.)
After starting to learn ASP.NET MVC, I was amazed how easy it is to access SQL with LINQ to SQL. But... it works only with MS SQL. So my question is: how do I implement the same functionality in my applications? How do I connect to databases easily?
I wrote my own wrapper classes for connecting to PostgreSQL, one class per table.
This is part of the Student class:
public class Student : User
{
    private static NpgsqlConnection connection = null;
    private const string TABLE_NAME = "students";

    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string Password { get; set; }

    /// <summary>
    /// Updates the student
    /// </summary>
    public void Update()
    {
        Connect();
        Run(String.Format("UPDATE " + TABLE_NAME + " SET first_name='{0}', last_name='{1}', password='{2}' WHERE id={3}", FirstName, LastName, Password, Id));
        connection.Dispose();
    }

    /// <summary>
    /// Inserts a new student
    /// </summary>
    public void Insert()
    {
        Connect();
        Run(String.Format("INSERT INTO " + TABLE_NAME + " (first_name, last_name, password) VALUES ('{0}', '{1}', '{2}')", FirstName, LastName, Password));
        connection.Dispose();
    }

    private static void Run(string queryString)
    {
        NpgsqlCommand cmd = new NpgsqlCommand(queryString, connection);
        cmd.ExecuteScalar();
        cmd.Dispose();
    }

    private static void Connect()
    {
        connection = new NpgsqlConnection("Server=localhost;Database=db;Uid=uid;Password=pass;pooling=false");
        connection.Open();
    }

    //....
So as you can see, the problem is that every INSERT, DELETE, and UPDATE request goes through the Connect() method, which connects to the database. I didn't realize how wasteful that was until I had to wait ten minutes for 500 rows to be inserted, because there were 500 connections to the database.
Using pooling when connecting does help, but making a connection and having the server check the pool on every single query is still wasteful.
So I decided to move the Connection property to a static DB class, and that didn't work either, because it's a really bad idea to store objects like connections in a static class.
I really don't know what to do now. Yes, there's the option of manually creating the connections in every Page_Load event and closing them at the end, like I'm doing right now:
Student student = new Student { FirstName="Bob", LastName="Black" };
NpgsqlConnection connection = ... ;
student.Insert(connection);
But this code is pretty ugly. I will be really thankful to anybody who can help me here.
I would not recommend this design. It is better to encapsulate each database call, which means each call opens a new connection whenever it needs to do something on the db. This might sound inefficient, were it not for connection pooling: ASP.NET automatically reuses connections from a pool for you. The problem in your design is that there is nothing that guarantees the connection will be closed.
Thus, you should try something like
private static void Insert()
{
    var sql = "Insert Into "...;
    ExecuteActionQuery(sql);
}

private static void ExecuteActionQuery(string query)
{
    using (var conn = new NpgsqlConnection(connString))
    {
        conn.Open();
        using (var cmd = new NpgsqlCommand(query, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
I typically make a few global functions that encapsulate the standard operations, so that I need only pass a query and parameters and my method does the rest. In my example, ExecuteActionQuery does not take parameters, but that was for demonstration only.
Not really pertaining to your question, but another option if you like LINQ to SQL: you could try DbLinq, which provides a LINQ to SQL provider for PostgreSQL and other databases.
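A parameterized variant of that helper might look like the sketch below (the connection string is taken from the question; the parameter names are illustrative). Besides keeping the open/close boilerplate in one place, parameters also close the SQL-injection hole that the string-formatted queries in the question have:

```csharp
using Npgsql;

static class Db
{
    // From the question; pooling is on by default, so per-call Open() is cheap.
    const string ConnString = "Server=localhost;Database=db;Uid=uid;Password=pass;";

    public static void ExecuteActionQuery(string query, params NpgsqlParameter[] parameters)
    {
        using (var conn = new NpgsqlConnection(ConnString))
        {
            conn.Open();
            using (var cmd = new NpgsqlCommand(query, conn))
            {
                cmd.Parameters.AddRange(parameters);
                cmd.ExecuteNonQuery();
            }
        } // Dispose() guarantees the connection goes back to the pool
    }
}

// Usage: values travel as parameters instead of being spliced into the SQL.
// Db.ExecuteActionQuery(
//     "INSERT INTO students (first_name, last_name) VALUES (@first, @last)",
//     new NpgsqlParameter("first", "Bob"),
//     new NpgsqlParameter("last", "Black"));
```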
http://code.google.com/p/dblinq2007/
I am facing some problems in my project. When I try to update an entity, it gives me different types of errors.
From what I've read, these errors occur because:
1 - I get the entity object from a method that creates its DataContext locally, and the update method then fails to update because it creates another DataContext locally (it doesn't even throw an exception).
I found many articles related to this problem:
1 - Adding a timestamp column to the table (this had no effect in my project; I tried it).
One suggestion was to use a SINGLE DataContext for everything. I did this by creating the following class:
public class Factory
{
    private static LinqDemoDbDataContext db = null;

    public static LinqDemoDbDataContext DB
    {
        get
        {
            if (db == null)
                db = new LinqDemoDbDataContext();
            return db;
        }
    }
}

public static Student GetStudent(long id)
{
    LinqDemoDbDataContext db = Factory.DB;
    //LinqDemoDbDataContext db = new LinqDemoDbDataContext();
    Student std = (from s in db.Students
                   where s.ID == id
                   select s).Single();
    return std;
}

public static void UpdateStudent(long studentId, string name, string address)
{
    Student std = GetStudent(studentId);
    LinqDemoDbDataContext db = Factory.DB;
    std.Name = name;
    std.Address = address;
    db.SubmitChanges();
}
In this case I want to update the student's details, and this solved my problem. But now the question is:
Is it a good approach to use the above technique in a web-based application?
"Is it a good approach to use the above technique in a web-based application?"
No. DataContext is not thread-safe. You cannot safely share one DataContext among the different threads handling different requests.
Also - this pattern is called Singleton, not Factory.
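A safer pattern for a web application is one DataContext per unit of work: the same context that loads the entity tracks and submits the change, and is disposed at the end of the request. A rough sketch, reusing the names from the question:

```csharp
public static void UpdateStudent(long studentId, string name, string address)
{
    // One DataContext per unit of work: load, modify, and submit through
    // the same context, then dispose it. No shared state across requests.
    using (var db = new LinqDemoDbDataContext())
    {
        var std = (from s in db.Students
                   where s.ID == studentId
                   select s).Single();
        std.Name = name;
        std.Address = address;
        db.SubmitChanges();
    }
}
```

Because the entity is loaded and updated through the same context, change tracking works without timestamp columns or Attach() gymnastics, and each request gets its own context so thread safety is a non-issue.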