I'm trying to do something slightly unusual... I currently have a SQLite database, accessed with NHibernate. This database is frequently uploaded to a server. I have a new requirement to create a new table for reporting purposes, which is expected to become pretty big. This table doesn't need to be uploaded to the server, so I'd like to put it in a separate database, and use ATTACH DATABASE to access it transparently from my main database.
The problem is I don't know how to do that with NHibernate... How can I tell NHibernate to attach the other database when it connects? I can't find any connection string parameter or NHibernate configuration property that allows it... is it even possible?
An acceptable option would be to manually execute the ATTACH DATABASE command when the connection is open, but I don't know how to do that. When I build the NH session factory, it immediately tries to update the schema (hbm2ddl.auto = update), and I don't have an opportunity to do anything with the connection before that. So it will just try to create the new table on my main database, which of course is not what I want...
Has anybody ever done that before? How did you do it?
Thanks
EDIT: In case someone needs to do the same, here's my solution, inspired by Diego's answer
Connection provider:
public class AttachedDbConnectionProvider : DriverConnectionProvider
{
private string _attachedDbAlias;
private string _attachedDbFileName;
public override IDbConnection GetConnection()
{
var connection = base.GetConnection();
if (!string.IsNullOrEmpty(_attachedDbAlias) && !string.IsNullOrEmpty(_attachedDbFileName))
{
using (var attachCommand = connection.CreateCommand())
{
attachCommand.CommandText = string.Format(
"ATTACH DATABASE '{0}' AS {1}",
_attachedDbFileName.Replace("'", "''"),
_attachedDbAlias);
attachCommand.ExecuteNonQuery();
}
}
return connection;
}
public override void Configure(IDictionary<string, string> settings)
{
base.Configure(settings);
settings.TryGetValue("connection.attached_db_alias", out _attachedDbAlias);
settings.TryGetValue("connection.attached_db_filename", out _attachedDbFileName);
}
}
Configuration file:
<property name="connection.provider">MyApp.DataAccess.AttachedDbConnectionProvider, MyApp.DataAccess</property>
<property name="connection.attached_db_alias">reportdb</property>
<property name="connection.attached_db_filename">mydatabase.report.db</property>
Now, to map a class to a table in the attached database, I just need to specify "reportdb." in the mapping file, as sketched below.
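For example, a minimal hbm.xml sketch (the ReportEntry class and its columns are hypothetical; the relevant part is the schema="reportdb" attribute, which makes NHibernate emit reportdb.report_entry in the generated SQL):
<class name="ReportEntry" table="report_entry" schema="reportdb">
  <id name="Id" column="id">
    <generator class="native" />
  </id>
  <property name="CreatedOn" column="created_on" />
</class>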
This might help...
public class MyConnectionProvider : DriverConnectionProvider
{
public override IDbConnection GetConnection()
{
var connection = base.GetConnection();
var attachCommand = connection.CreateCommand();
attachCommand.CommandText = "ATTACH DATABASE 'foo.db' AS foo"; // ATTACH needs a database file name and an alias
attachCommand.ExecuteNonQuery();
return connection;
}
}
Config:
<property name="connection.provider">MyConnectionProvider, MyAssembly</property>
I am using a SQLite database in my universal app. I want to password-protect the DB file. I am able to set the password for the db file, but when I try to read it, I get an error like "SQLite Error 26: 'file is encrypted or is not a database'".
I referred to this URL. I am using Entity Framework Core in a .NET Standard library. Is it possible to read values from the encrypted DB in a .NET Standard library?
The author of the article you linked actually has an answer here on SO that deals with the EF Core scenario as well. You can open the database with EF Core in the OnConfiguring method override:
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;

class AppDbContext : DbContext
{
    private SqliteConnection _connection;

    protected override void OnConfiguring(DbContextOptionsBuilder options)
    {
        // Open the connection ourselves so we can key it before EF Core uses it.
        _connection = new SqliteConnection(_connectionString);
        _connection.Open();

        using (var command = _connection.CreateCommand())
        {
            command.CommandText = "PRAGMA key = 'password';";
            command.ExecuteNonQuery();
        }

        options.UseSqlite(_connection);
    }

    // DbContext.Dispose is public, not protected; call the base implementation too.
    public override void Dispose()
    {
        _connection?.Dispose();
        base.Dispose();
    }
}
The key is to provide the password via a PRAGMA command before the first query on the connection.
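If the password can contain quote characters, a safer variant (a sketch following the Microsoft.Data.Sqlite documentation; the password variable here is hypothetical) lets SQLite escape it with quote() before issuing the PRAGMA:
// Sketch: escape the password with SQLite's quote() so it can be embedded
// safely in the PRAGMA, then key the connection before the first real query.
var command = _connection.CreateCommand();
command.CommandText = "SELECT quote($password);";
command.Parameters.AddWithValue("$password", password);
var quotedPassword = (string)command.ExecuteScalar();

command.CommandText = "PRAGMA key = " + quotedPassword + ";";
command.Parameters.Clear();
command.ExecuteNonQuery();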
My Controller code
ParaEntities db = new ParaEntities();
public List<Client> GetAllClients()
{
return db.Client.ToList();
}
Please click this link to see the error message
The weird thing is that the first time I click the button to get all client information, it responds with a 500 error. The second time I click the button, the request succeeds.
You should assign the data to a variable and display the data in the View.
Please change the code as I've written below.
ParaEntities db = new ParaEntities();
public List<Client> GetAllClients()
{
var getData= db.Client.ToList();
if(getData==null)
{
return null;
}
return getData;
}
This error points to a connection problem rather than a code issue. Check that the connection string is valid and that the user specified in the connection string has access to the database. If you're running the application on IIS, then make sure that the application pool user has access to the database. Here is another SO issue where they solved this error.
If you want to store the db context as a local variable in your controller class, then I suggest you instantiate it inside the controller's constructor. That way you make sure that every time an instance of the controller is created, a new db context is created as well.
Let's say your controller is named ClientController:
private ParaEntities db;
public ClientController()
{
this.db = new ParaEntities();
}
public List<Client> GetAllClients()
{
return db.Client.ToList();
}
Another approach is to wrap your db context in a using statement inside the method. That way you make sure that the method uses a fresh context when called and that the context is disposed when the operation is completed.
public List<Client> GetAllClients()
{
using(ParaEntities db = new ParaEntities())
{
return db.Client.ToList();
}
}
PS: both examples violate the dependency inversion principle (hard coupling to the db context), but that's for another day.
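If you do want to remove that coupling later, a minimal sketch (assuming Web API and a DI container that injects into controllers; the IClientRepository interface is made up for illustration) would look like this:
public interface IClientRepository
{
    List<Client> GetAllClients();
}

public class ClientController : ApiController
{
    private readonly IClientRepository _repository;

    // The container supplies the repository, so the controller no longer
    // creates ParaEntities itself and can be unit tested against a fake.
    public ClientController(IClientRepository repository)
    {
        _repository = repository;
    }

    public List<Client> GetAllClients()
    {
        return _repository.GetAllClients();
    }
}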
Please try this
public List<Client> GetAllClients()
{
ParaEntities db = new ParaEntities();
return db.Client.ToList();
}
I'm in the process of taking a .NET system live. I am using LINQ and MVC. I had to create the live database and hoped that this would run smoothly; however, it did not.
New SQL SERVER - Microsoft Windows NT 5.0 (2195) / 8.00.760
I created an administrator user who can add/edit/delete. Basically I receive the following error if I try to add (.InsertOnSubmit) or delete (.DeleteOnSubmit) any rows, but not when I edit.
"Network access for Distributed Transaction Manager (MSDTC) has been disabled. Please enable DTC for network access in the security configuration for MSDTC using the Component Services Administrative tool."
I've Googled this error and found people believe it's to do with the "MSDTC service"; however, this already seems to be ticked for the service. I have logged into the database with SQL Server Management Studio and this user can add/delete.
One Example:
Controller
if (_service.AddAccess(access)) return RedirectToAction("Access");
public Repository()
{
_db = new DataClassDataContext(ConfigurationManager.ConnectionStrings["RegistrarsServices"].ConnectionString);
}
public void Save()
{
_db.SubmitChanges();
}
public bool AddAccess(Access access)
{
try
{
using (var scope = new TransactionScope())
{
_db.Accesses.InsertOnSubmit(access);
Save();
scope.Complete();
}
return true;
}
catch (Exception)
{
return false;
}
}
*Please note this did work when using the development server. Microsoft Windows NT 5.2 (3790) / 10.0.1600.22
Controller
private readonly ServiceAdministration _service = new ServiceAdministration();
public ActionResult AddAccess()
{
return View(new EmailViewModel());
}
[HttpPost]
public ActionResult AddAccess(EmailViewModel emailViewModel)
{
if (emailViewModel.Button == "Back") return RedirectToAction("Access");
if (!ModelState.IsValid) return View(emailViewModel);
Access access = new Access();
access.emailAddress = emailViewModel.emailAddress;
if (_service.AddAccess(access)) return RedirectToAction("Access");
emailViewModel.errorMessage = "An error has occurred whilst trying to grant access. Please try again later.";
return View(emailViewModel);
}
Service
readonly Repository _repository = new Repository();
public bool AddAccess(Access access)
{
return _repository.AddAccess(access);
}
Don't really know what I am missing.
Thanks in advance for any help.
Clare :-)
Do you have any idea why you are triggering distributed transactions? This is what you need to investigate. The usual culprit is multiple ADO.NET connections opened from a single TransactionScope. See ADO.NET and System.Transactions and ADO.NET and LINQ to SQL. Make sure you use a single connection (i.e. a single LINQ to SQL context) in a transaction scope. You shouldn't need more than one per HTTP call anyway.
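In practice that means one DataContext (and therefore one connection) doing all the work inside the scope; a hedged sketch of the repository method under that constraint, reusing the same connection string lookup as the question:
public bool AddAccess(Access access)
{
    try
    {
        // One context = one connection per unit of work. Opening a second
        // connection inside the same TransactionScope is what typically
        // escalates a local transaction to a distributed (MSDTC) one.
        using (var db = new DataClassDataContext(
            ConfigurationManager.ConnectionStrings["RegistrarsServices"].ConnectionString))
        using (var scope = new TransactionScope())
        {
            db.Accesses.InsertOnSubmit(access);
            db.SubmitChanges();
            scope.Complete();
        }
        return true;
    }
    catch (Exception)
    {
        return false;
    }
}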
I moved to ASP.NET from PHP, where queries are run directly. So I always create a connection in the Page_Load event, dispose of it after I've done everything I need, and access data with NpgsqlCommand. (Yes, I use PostgreSQL in my ASP.NET applications.)
After starting to learn ASP.NET MVC I was amazed at how easy it is to access SQL with the LINQ to SQL thing. But... it works only with MS SQL. So my question is: how do I implement the same functionality in my applications? How do I connect to databases easily?
I wrote my own wrapper classes for connecting to PostgreSQL, one class per table.
This is a part of the Student class:
public class Student : User
{
private static NpgsqlConnection connection = null;
private const string TABLE_NAME = "students";
public int Id { get; set; }
public string FirstName { get; set; }
public string LastName { get; set; }
public string Password { get; set; }
/// <summary>
/// Updates the student
/// </summary>
public void Update()
{
Connect();
Run(String.Format("UPDATE " + TABLE_NAME + " SET first_name='{0}', last_name='{1}', password='{2}' WHERE id={3}", FirstName, LastName, Password, Id));
connection.Dispose();
}
/// <summary>
/// Inserts a new student
/// </summary>
public void Insert()
{
Connect();
Run(String.Format("INSERT INTO " + TABLE_NAME + " (first_name, last_name, password) VALUES ('{0}', '{1}', '{2}')",FirstName, LastName, Password));
connection.Dispose();
}
private static void Run(string queryString)
{
NpgsqlCommand cmd = new NpgsqlCommand(queryString, connection);
cmd.ExecuteScalar();
cmd.Dispose();
}
private static void Connect()
{
connection = new NpgsqlConnection(String.Format("Server=localhost;Database=db;Uid=uid;Password=pass;pooling=false"));
connection.Open();
}
//....
So as you see the problem here is that with every INSERT, DELETE, UPDATE request I'm using Connect() method which connects to the database. I didn't realize how stupid it was before I had to wait for 10 minutes to have 500 rows inserted, as there were 500 connections to the database.
Using pooling while connecting does help, but still making the connection and making the server check the pool during every single query is stupid.
So I decided to move Connection property to a static DB class, and it didn't work either, because it's a really bad idea to store such objects as connections in a static class.
I really don't know what to do now. Yes, there's the option of manually creating the connection in every Page_Load event and closing it at the end, like I'm doing right now.
Student student = new Student { FirstName="Bob", LastName="Black" };
NpgsqlConnection connection = ... ;
student.Insert(connection);
But this code is pretty ugly. I will be really thankful to anybody who can help me here.
I would not recommend this design. It is better to encapsulate each database call, which means opening a new connection each time you need to do something on the db. This might sound inefficient if it were not for connection pooling: ASP.NET will reuse connections automatically for you from a pool. The problem in your design is that there is nothing that guarantees the connection will be closed.
Thus, you should try something like
private static void Insert()
{
var sql = "Insert Into "....;
ExecuteActionQuery(sql);
}
private static void ExecuteActionQuery(string query)
{
    using (var conn = new NpgsqlConnection(connString))
    {
        conn.Open();
        using (var cmd = new NpgsqlCommand(query, conn))
        {
            cmd.ExecuteNonQuery();
        }
    }
}
I typically make a few global functions that encapsulate the standard operations, so that I only need to pass a query and parameters and my method does the rest. In the example above, ExecuteActionQuery does not take parameters, but that was for demonstration only.
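A parameterised variant of the same idea (a sketch; connString is assumed to be defined elsewhere, student is a hypothetical Student instance, and the column names come from the students table in the question) would also remove the string concatenation from the Insert/Update methods:
private static void ExecuteActionQuery(string query, params NpgsqlParameter[] parameters)
{
    using (var conn = new NpgsqlConnection(connString))
    {
        conn.Open();
        using (var cmd = new NpgsqlCommand(query, conn))
        {
            // Parameters keep user input out of the SQL text.
            cmd.Parameters.AddRange(parameters);
            cmd.ExecuteNonQuery();
        }
    }
}

// Usage:
ExecuteActionQuery(
    "INSERT INTO students (first_name, last_name, password) VALUES (@first, @last, @pass)",
    new NpgsqlParameter("first", student.FirstName),
    new NpgsqlParameter("last", student.LastName),
    new NpgsqlParameter("pass", student.Password));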
Not really pertaining to your question, but another option: if you like LINQ to SQL, you could try DbLinq, which provides a LINQ to SQL provider for PostgreSQL and other databases.
http://code.google.com/p/dblinq2007/
Before I invest the time in modifying the SubSonic 3 source, I figured I'd ask to see if I'm missing something simple.
Is it possible to use the SubSonic 3 Repository with migrations on a SQLite in-memory database? I couldn't find a way to force the DbDataProvider to keep the connection open, so the in-memory SQLite database vanishes when the connection gets closed.
The unit test with the connection string I was trying is...
public class SQLite_InMemory_SimpleRepositoryTests
{
public class Job
{
public Guid JobId { get; set; }
public string JobName { get; set; }
}
[Fact]
public void SQLite_InMemory_SimpleRepo_CanStayOpen()
{
IDataProvider provider = ProviderFactory.GetProvider("Data Source=:memory:;Version=3;New=True;Pooling=True;Max Pool Size=1;", "System.Data.SQLite");
IRepository repository = new SimpleRepository(provider, SimpleRepositoryOptions.RunMigrations);
for (int i = 0; i < 10000; i++)
{
var job = new Job {JobId = Guid.NewGuid(), JobName = "Job_"+i};
repository.Add(job);
}
}
}
I tried setting the "Shared Connection" on the IDataProvider, but the connection still seemed to close.
If not, I'll update the SubSonic source, and submit the changes.
Thanks!
Interesting - no, there's no way that I can think of to do this, other than maybe creating a static IDataProvider, but even then we close off the connection for doing things like executing scalars, etc.
I suppose you could create such a thing by implementing IDataProvider and then setting things up as you need - all the execution goes through it. But this is making me wonder if we explicitly shut things down in the calling code - which would be bad design on my part... hmmm.
Would love to have this feature...
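As a general SQLite note (not SubSonic-specific), one way to keep an in-memory database alive is to hold a single "anchor" connection to a shared-cache in-memory URI for the lifetime of the tests; a sketch assuming System.Data.SQLite with FullUri support (1.0.82+):
// Sketch: all connections using the same shared-cache URI see one database;
// it is discarded only when the last of them closes.
const string connectionString = "FullUri=file::memory:?cache=shared;Version=3;";

using (var keepAlive = new SQLiteConnection(connectionString))
{
    keepAlive.Open();   // anchor connection: keeps the in-memory DB alive

    // Any other connection (e.g. one opened by a data provider) that uses
    // the same connection string now works against the same database.
    using (var worker = new SQLiteConnection(connectionString))
    {
        worker.Open();
        using (var cmd = new SQLiteCommand(
            "CREATE TABLE IF NOT EXISTS Job (JobId TEXT, JobName TEXT)", worker))
        {
            cmd.ExecuteNonQuery();
        }
    }
}   // closing the anchor connection discards the database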