I read that when working with a local SQLite database in Flex (no other framework), it is considered best practice to have a singleton data access object class.
While the singleton part is clear, I am not sure what the DAO part would mean in a Flex app.
So, how do you create a data access object in Flex (for SQLite transactions)?
EDIT:
What I currently do is something like this:
public class Database
{
//constructor, instantiation of sqlConnection and others
//...
public function getPeople():Array
{
var query:SQLStatement=new SQLStatement();
query.sqlConnection=sqlConnection;//Defined in the constructor as static
query.text='select * from people order by name asc';
try
{
query.execute();
return query.getResult().data;
}
catch(e:SQLError){}
return null;
}
}
Thank you.
Related
I want to write unit tests for a class that contains LINQ to SQL code. Inside each method I create a new DbContext and do the database work there.
I searched the web. At first I was going to use the Repository and Unit of Work patterns, but I figured out that DbContext itself is a unit of work and its DbSets act as repositories. Another point is that I think there is no need to test the LINQ part itself, because it works as it should (it was tested by the .NET team); I want to test the logic I have added on top of it. So I decided to create an interface with the necessary methods and two implementations: one uses LINQ to SQL, while the other is just a mock. Something like this:
public interface IDbManager
{
bool Insert(MyEntity newEntity);
}
public class RealDbManager:IDbManager
{
public bool Insert(MyEntity newEntity)
{
using (DbDataContext db = new DbDataContext())
{
db.MyEntities.InsertOnSubmit(newEntity);
db.SubmitChanges();
return true;
}
}
}
public class MockDbManager:IDbManager
{
public bool Insert(MyEntity newEntity)
{
return true;
}
}
Is the whole idea correct? If so, is this a correct implementation?
Is it possible to define DbDataContext as a class variable instead of creating a new instance inside each method?
You have the right general idea for a start. Your Mock Insert method should save the entity to some in-memory store so that subsequent queries will return the inserted information, as would be expected. But the very basic idea of having an interface, with a 'real' and a 'mock' implementation is there.
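For illustration only, such a fake might look like this (the in-memory list and the Inserted property are just my assumptions, not part of your code):
using System.Collections.Generic;

public class FakeDbManager : IDbManager
{
    // Keeps the "inserted" entities in memory so later reads in the same test can see them.
    private readonly List<MyEntity> _store = new List<MyEntity>();

    public bool Insert(MyEntity newEntity)
    {
        _store.Add(newEntity);
        return true;
    }

    // Hypothetical helper so tests can assert on what was inserted.
    public IEnumerable<MyEntity> Inserted
    {
        get { return _store; }
    }
}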
Remember that when using your Mock in tests, you are testing your other code that uses the mock - not the mock itself.
As for defining the DataContext as a member variable; you could use an IDisposable pattern for it, like so:
public class RealDbManager:IDbManager, IDisposable
{
DbDataContext db = new DbDataContext();
public bool Insert(MyEntity newEntity)
{
db.MyEntities.InsertOnSubmit(newEntity);
db.SubmitChanges();
return true;
}
public void Dispose()
{
db.Dispose();
}
}
You would just have to be sure to dispose of your DbManager, then.
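For example, a call site could look something like this (just a sketch, using the classes above):
public class InsertExample
{
    public void Run()
    {
        // The using block guarantees Dispose() runs, which in turn disposes the DbDataContext.
        using (var manager = new RealDbManager())
        {
            manager.Insert(new MyEntity());
        }
    }
}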
Yes. The only thing I would avoid is writing an actual mocked class by hand (in that case it should really be called a fake); I would use a mocking engine instead.
In your question you mention two kinds of tests. The first tests the behavior of your class, the second tests its integration. They may seem the same, but they are not.
For the first, you mock your class to test how the other classes that use it behave, this way (using Moq):
[Test]
public void Test()
{
var entity = new MyEntity();
var mocked = new Mock<IDbManager>();
//you are telling the Moq engine that every time it finds an invocation of your repository
//it should return true, as your mocked class did
mocked.Setup( x => x.Insert( entity ) ).Returns( true );
var classUnderTest = new ClassUnderTest( mocked.Object );
//in this method you invoke your repository
var ret = classUnderTest.DoSomething( entity );
//assertions
Assert.AreEqual( something, ret );
//you can also verify that your repository has been hit exactly once
mocked.Verify( x => x.Insert( It.IsAny<MyEntity>() ), Times.Once() );
}
In the latter, as you correctly state, you have nothing to test on the LINQ side (Microsoft did that for us), but if you need to verify the correctness of your LINQ queries you can only do that against a real db (or by using a repository pattern against a fake repository). This is an integration test and it has nothing to do with mocking.
To decouple your class from DbContext you could use the repository pattern. Have a look at this article: http://dotnetspeak.com/index.php/2011/03/repository-pattern-with-entity-framework/
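Just as a rough sketch of that idea (the names below are illustrative, not from the article), the repository hides the data context behind an interface that is easy to stub in tests:
using System;

// Hypothetical repository abstraction over the data context.
public interface IMyEntityRepository : IDisposable
{
    void Add(MyEntity entity);
    void SaveChanges();
}

// Concrete implementation that owns the LINQ to SQL context.
public class MyEntityRepository : IMyEntityRepository
{
    private readonly DbDataContext _db = new DbDataContext();

    public void Add(MyEntity entity)
    {
        _db.MyEntities.InsertOnSubmit(entity);
    }

    public void SaveChanges()
    {
        _db.SubmitChanges();
    }

    public void Dispose()
    {
        _db.Dispose();
    }
}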
Our ASP.NET web application uses LINQ-to-SQL (stored procs are dragged and dropped onto the dbml file to create classes) and the 3-tier architecture is similar to the one below. I have sketched rough methods to give the reader a proper idea of the setup so that they can answer well.
namespace MyDataLayer
{
public class MyDataAccess
{
// global instance of datacontext
MyDataModelDataContext myDB = new MyDataModelDataContext(); // (#1)
public void GetUserIDByUsername(string sUserName, ref int iUserID)
{
int? iUserIDout = 0;
// this will make call to SP in SQL DB
myDB.USP_RP_GETUSERIDBYUSERNAME(sUserName, "", ref iUserIDout);
iUserID = (int)iUserIDout;
}
public List<USP_APP_USERDETAILSResult> GetUserDetails(string sUserIDs)
{
// this will make call to SP in SQL DB
return myDB.USP_APP_USERDETAILS(sUserIDs).ToList();
}
...
... // several CRUD methods
}
}
namespace MyBusinessLayer
{
public class SiteUser
{
// global DataAccess instance
MyDataLayer.MyDataAccess myDA = new MyDataAccess(); // (#2)
public void GetUserIDByUsername(string sUserName, ref int iUserID)
{
myDA.GetUserIDByUsername(sUserName, ref iUserID);
}
public List<USP_APP_USERDETAILSResult> GetUserDetails(string sUserIDs)
{
// this will make call to SP in SQL DB
return myDA.GetUserDetails(sUserIDs);
}
...
... // several CRUD methods
}
}
namespace MyWebApplication
{
public class BaseWebPage : System.Web.UI.Page
{
// static business layer instance
public static MyBusinessLayer.SiteUser UserBLInstance = new SiteUser(); // (#3)
...
}
}
// Index.aspx.cs code fragment
namespace MyWebApplication
{
public class Index : BaseWebPage
{
public void PopulateUserDropDown()
{
// using static business layer instance declared in BaseWebPage
List<USP_APP_USERDETAILSResult> listUsers = UserBLInstance.GetUserDetails("1,2,3");
// do databinding and so on ...
}
...
}
}
Questions
(Ref. #1) Is having a global datacontext in DataAccess a good approach? Yes/no, and why?
If your suggestion is to have a datacontext per request, what is the best practice for that?
(Ref. #2) Is having a global DataAccess instance in the BusinessLayer a good approach? Yes/no, and why?
If your suggestion is to have a DataAccess instance per request, what is the best practice for that?
(Ref. #3) Is a static business layer instance declared in BaseWebPage a good approach? Yes/no, and why?
What is the best approach to manage the lifetime of BL and DL instances in general?
We are facing a periodic InvalidCastException on the production server for a very simple method, which works fine again once I restart the application from IIS. While the problem is occurring we can still access the same database from SQL Management Studio and execute the same SP.
Our prime suspect for this issue is poor DataContext management. I have read many articles on the net about managing the lifetime of a DataContext, but I am now confused by the various approaches.
That's why I have elaborated my questions, so that others in the same situation can get a clear idea of the problem/answer.
(Ref. #1) Is having a global datacontext in DataAccess a good approach? Yes/no, and why?
Yes.
However, creating it manually inside the data access class means that you can't control the lifetime of the datacontext. Instead, make it a constructor parameter so that it is injected into the data access class.
(Ref. #2) Is having a global DataAccess instance in the BusinessLayer a good approach? Yes/no, and why?
Yes. But refer to 1. - make it injectable via the constructor.
(Ref. #3) Is a static business layer instance declared in BaseWebPage a good approach? Yes/no, and why?
No. Avoid statics for complex objects, as such objects usually have non-trivial state. That is when a lot of nasty issues can happen if you share them in a concurrent environment.
To summarize.
public class DataAccess {
public DataAccess( DataContext context ) { ... }
}
public class BusinessLayer {
public BusinessLayer( DataAccess access ) { ... }
}
public class MyPage : Page {
...
protected void Page_Load( object sender, EventArgs e ) {
var ctx = TheDataContext.Current;
var bl = new BusinessLayer( new DataAccess( ctx ) );
}
}
with data context shared in a request scope:
public partial class TheDataContext {
// Allow the datacontext to be shared in a request-scope
public static TheDataContext Current {
get {
if ( HttpContext.Current.Items["context"] == null )
HttpContext.Current.Items.Add( "context", new TheDataContext() );
return (TheDataContext)HttpContext.Current.Items["context"];
}
}
}
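One thing the snippet above leaves open is disposal; one way to handle it (my own sketch, not part of the original code) is to dispose the shared context when the request ends, e.g. in Global.asax:
using System;
using System.Web;

public class Global : HttpApplication
{
    // Dispose the request-scoped DataContext when the request ends.
    protected void Application_EndRequest( object sender, EventArgs e )
    {
        var ctx = HttpContext.Current.Items["context"] as TheDataContext;
        if ( ctx != null )
        {
            ctx.Dispose();
            HttpContext.Current.Items.Remove( "context" );
        }
    }
}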
In your sample, your MyDataLayer is what is usually called a Repository. It is definitely good to have the DataContext instance in the repositories and not try to use it outside of them. That way only the repositories depend on LINQ to SQL, which means you can create stub objects for these repositories and very easily test the other parts of your application.
You should definitely Dispose your DataContext instances; a DataContext holds on to too many objects to just keep them alive and let the GC collect them. As you can see, you don't create any transaction objects when working with DataContexts, so I think LINQ to SQL is built around the idea that you have one context per transaction (you can also try to handle transactions manually, but do you really want to?). Disposing the DataContext inside the repository methods is a good approach, because it will not allow you to use the cool feature of all ORM frameworks: lazy loading. If you try lazy loading you will like it, but it is usually just one of the possible causes of performance degradation.
You should definitely use DataContexts for no longer than the duration of a request; don't try to use a long session (that is, keeping a DataContext alive across more than one HTTP request - it is just a pain, nothing else; if you want to read about this, try a couple of articles about Long Running Sessions in Hibernate. I tried it with NHibernate - don't do this at home ;) ).
I have an ASP.NET website project that until recently had all code in the App_Code folder. It uses Entity Framework 4 as its ORM. The application is divided into three "sections" (let's say one for each customer). Each section has its own database (but the same schema). This is for performance reasons; the databases are over 10GB each, with millions of rows.
Each time a context object is created, a Session variable which holds the section ID is read and the appropriate connection string is chosen for that context.
It looks like this (the following are members of a static Connection class):
public static MyEntities GetEntityContext()
{
if (HttpContext.Current.Session["section"] == null)
{
HttpContext.Current.Response.Redirect("~/Login.aspx");
}
var context = new MyEntities(GetEntityConnectionStringForSection((int)HttpContext.Current.Session["section"]));
return context;
}
private static string GetEntityConnectionStringForSection(int section)
{
switch (section)
{
case 1: return ConfigurationManager.ConnectionStrings["entity_1"].ConnectionString;
case 2: return ConfigurationManager.ConnectionStrings["entity_2"].ConnectionString;
case 3: return ConfigurationManager.ConnectionStrings["entity_3"].ConnectionString;
default: return ConfigurationManager.ConnectionStrings["entity_1"].ConnectionString;
}
}
It works very well and also handles the situation when the session has timed out, every time any data access is performed.
Recently, as I needed to share the DB classes between two websites, I moved all DB classes to a separate class library and referenced the System.Web assembly, which I know is bad practice, but it works.
Now the next step is to add unit and module tests, which, as I read, is very difficult or impossible when using HttpContext in a library, so I want to get rid of the System.Web references. What is the best practice in this situation?
I think I can't just pass HttpContext to GetEntityContext(), as it is also called from within my entity classes. Although this could probably be refactored. So maybe this is where I should go?
I also wondered whether it is possible to somehow pass the current section ID to the whole library. It cannot just be a static property because, as far as I understand, that would be shared by all users of the application, and this should be user-specific.
To sum up, the objective is to make automated testing possible without losing the transparent connection string selection and session timeout handling.
If I am doing something fundamentally wrong at this stage, please also let me know. I can look at this question again tomorrow morning (8:00 am UTC), so please don't be discouraged by my silence until then.
EDIT:
Example of usage of Connection class in the library:
public partial class Store
{
public static List<Store> GetSpecialStores()
{
using (var context = Connection.GetEntityContext())
{
return context.Stores.Where(qq => qq.Type > 0).OrderBy(qq => qq.Code).ToList();
}
}
}
You can declare an interface IContextProvider inside your library and use it to retrieve the context. Something like:
public interface IContextProvider
{
MyEntities GetEntityContext();
}
This will make your library testable. In your web project you can inject an IContextProvider implementation into your library.
public class WebContextProvider : IContextProvider
{
public MyEntities GetEntityContext()
{
if (HttpContext.Current.Session["section"] == null)
HttpContext.Current.Response.Redirect("~/Login.aspx");
int sectionId = (int)HttpContext.Current.Session["section"];
string connectionString = GetEntityConnectionStringForSection(sectionId);
var context = new MyEntities(connectionString);
return context;
}
private static string GetEntityConnectionStringForSection(int section)
{
switch (section)
{
case 1: return ConfigurationManager.ConnectionStrings["entity_1"].ConnectionString;
case 2: return ConfigurationManager.ConnectionStrings["entity_2"].ConnectionString;
case 3: return ConfigurationManager.ConnectionStrings["entity_3"].ConnectionString;
default: return ConfigurationManager.ConnectionStrings["entity_1"].ConnectionString;
}
}
}
Inject this interface into repositories or other data access classes.
public partial class Store
{
private IContextProvider contextProvider;
public Store(IContextProvider contextProvider)
{
this.contextProvider = contextProvider;
}
public List<Store> GetSpecialStores()
{
using (var context = contextProvider.GetEntityContext())
{
return context.Stores.Where(qq => qq.Type > 0).OrderBy(qq => qq.Code).ToList();
}
}
}
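In the web project you would then compose the two, for example in a page (the page name and the binding step below are placeholders):
using System;
using System.Collections.Generic;
using System.Web.UI;

// Hypothetical page in the web project: it composes the provider and the data access class.
public partial class StoresPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        IContextProvider provider = new WebContextProvider();
        var storeData = new Store(provider);
        List<Store> specialStores = storeData.GetSpecialStores();
        // ...bind specialStores to a grid or drop-down here...
    }
}
In a unit test you would instead pass a fake IContextProvider that returns a context pointed at a test database (or a stubbed context), so System.Web never gets involved.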
I hope this makes sense. I have an ASP.NET web application that uses Entity Framework. I have added a couple of custom tables to the db and created a separate project to handle the CRUD operations for those tables. I chose a separate project because I don't want future upgrades to the application to overwrite my custom features.
My problem is this. How do I attach/combine my custom ObjectContext to the ObjectContext of the application? I want to use the same UnitOfWorkScope (already in the application) to maintain the one ObjectContext instance per HTTP request. Again, I don't want to add my ObjectSets to the application's ObjectContext for my reason listed above.
Here is some code:
Widget.cs
public partial class Widget
{
public Widget()
{
}
public int WidgetId {get;set;}
public string WidgetName {get;set;}
}
WidgetObjectContext.cs
public partial class WidgetObjectContext : ObjectContext
{
private readonly Dictionary<Type, object> _entitySets;
public ObjectSet<T> EntitySet<T>()
where T : BaseEntity
{
var t = typeof(T);
object match;
if(!_entitySets.TryGetValue(t, out match))
{
match = CreateObjectSet<T>();
_entitySets.Add(t, match);
}
return (ObjectSet<T>)match;
}
public ObjectSet<Widget> Widgets
{
get
{
if((_widgets == null))
{
_widgets = CreateObjectSet<Widget>();
}
return _widgets;
}
}
private ObjectSet<Widget> _widgets;
}
In my WidgetManager class, if I were using the application's ObjectContext, I would query my tables like this:
var context = ObjectContextHelper.CurrentObjectContext;
var query = from c in context.ObjectSet .... etc
What I want would be to do something like this:
var context = ObjectContextHelper.CurrentObjectContext.Attach(WidgetObjectContext);
I know this won't work but that is the gist of what I am trying to accomplish. Hope this is clear enough. Thanks.
I don't think it is possible. ObjectContext creates an entity connection which relies on metadata describing the mapping and the database, but you have two different sets of metadata - one for the ASP.NET application and one for the separate project. Simply put, you need two connections to work with these models => you need two ObjectContexts.
FYI: the previous answer was correct at the time it was written. It is now possible to do this using the DbContext available in EF 4.1. The caveat is that you must use the code-first strategy in order to build your custom context. In other words, you won't be able to use EDMX files to accomplish this.
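As a minimal sketch of that approach (assuming EF 4.1 code-first; the class name and the "WidgetDb" connection string name are placeholders), the custom context can simply point at the same database as the application's EDMX-based context:
using System.Data.Entity;

// Code-first context for the custom tables, kept in the separate project.
public class WidgetContext : DbContext
{
    static WidgetContext()
    {
        // The tables already exist, so turn off the database initializer.
        Database.SetInitializer<WidgetContext>(null);
    }

    // "WidgetDb" is a plain (non-EDMX) connection string pointing at the same database.
    public WidgetContext() : base("name=WidgetDb") { }

    public DbSet<Widget> Widgets { get; set; }
}
Each context still tracks its own entities; they just run against the same database, and the code-first context stays out of the application's EDMX model.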
I currently have an application which consists of:
User Interface (web page)
BLL (Manager & Domain Objects)
DAL (DataAccess class for each of my Domain Objects).
I use the following in the UI to search for a domain object.
protected void Button1_Click(object sender, EventArgs e)
{
IBook book = BookManager.GetBook(int.Parse(txtID.Text));
}
Here is my BLL
public class BookManager
{
public static IBook GetBook(int bookId)
{
return BookDB.GetBook(bookId);
}
}
public class Book : IBook
{
private int? _id;
private string _name;
private string _genre;
public string Name
{
get { return _name; }
private set
{
if (string.IsNullOrEmpty(value))
throw new Exception("Invalid Name");
_name = value;
}
}
public string Genre
{
get { return _genre; }
private set
{
if (string.IsNullOrEmpty(value))
throw new Exception("Invalid Genre");
_genre = value;
}
}
// Other IBook Implementations
}
And finally here is my DAL
public class BookDB
{
public static IBook GetBook(int id)
{
// Get Book from database using sproc (not allowed to use any ORM)
// ?? Create IBook Item?
// return IBook
}
}
How would one create an IBook object and return it to the Manager?
I'm thinking of returning a DataTable from BookDB to BookManager and having it create the Book object and return it, but that doesn't seem right.
Is there another way to do this?
Edit:
I decided to separate each layer into its own project and ran into a circular dependency problem in the DAL layer when trying to add a reference to the BLL.
I can't access the Book class, the interface, or anything else in the BLL from the DAL.
Should I just use ADO.NET objects here and have my manager create the actual object from the ADO.NET object?
Here's how it's laid out:
BLL.Managers - BookManager
BLL.Interfaces - IBook
BLL.Domain - Book
DAL - BookDB.
Thanks!
You could create dummy Book objects that contain only data: get/set properties and member values. This book has one property for each field in the database, but doesn't validate anything.
You fill the object from the db, then send it to the BLL.
When you want to save the object, you also send it to the BLL.
Your classes in the BLL could wrap around those objects, if that makes sense. This way, it is easy to just send them back to the DAL.
Dummy Book:
public class DummyBook:IBook
{
private int? _id;
private string _name;
private string _genre;
public int? Id
{
get {return _id;}
set {_id = value;}
}
public string Name
{
get {return _name;}
set {_name = value;}
}
public string Genre
{
get {return _genre;}
set {_genre= value;}
}
}
DAL Book:
public class DALBook
{
public static IBook GetBook(int id)
{
DataTable dt;
DummyBook db = new DummyBook();
// Code to get datatable from database
// ...
//
db.Id = (int)dt.Rows[0]["id"];
db.Name = (string)dt.Rows[0]["name"];
db.Genre = (string)dt.Rows[0]["genre"];
return db;
}
public static void SaveBook(IBook book)
{
// Code to save the book in the database
// you can use the properties from the dummy book
// to send parameters to your stored proc.
}
}
BLL Book:
public class Book : IBook
{
private DummyBook _book;
public Book(int id)
{
_book = DALBook.GetBook(id);
}
public string Name
{
get {return _book.Name;}
set
{
if (string.IsNullOrEmpty(value))
{
throw new Exception("Invalid Name");
}
_book.Name = value;
}
}
// Code for other Properties ...
public void Save()
{
// Add validation if required
DALBook.SaveBook(_book);
}
}
Edit 1: The dummy classes should go in their own project (Model, as stated in the comments, is fine). The references would work as follows:
The DAL References the Model Project.
The BLL References the Model and the DAL.
The UI References the BLL.
BookDB should return the IBook instance. I like the repository pattern, which is all about mapping from the db to the domain.
The repository implementation returns instances of the domain objects. This shields the rest of the code from the particular persistence implementation, which can be affected by the technology (database type, web service, [insert something else]) and the format used to save the data.
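A bare-bones version of that contract (names purely illustrative) could be:
// Hypothetical repository that hides the persistence details behind domain-level methods.
public interface IBookRepository
{
    IBook GetById(int id);
    void Save(IBook book);
}
BookManager then depends only on IBookRepository, and the concrete implementation (stored procs today, something else tomorrow) can change without touching the business layer.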
I would probably use ExecuteReader to create an object in code from the database. The reason is that a DataTable has more overhead than a reader, because it has more functionality (and is itself populated by a reader). Since you aren't doing updates/deletes through the DataTable, you don't need the overhead.
That being said, I would make a static helper method in the BookManager class.
internal static IBook BookFromReader(IDataReader reader)
{
Book B = new Book();
B.Prop = reader.GetString(0);
// ...rinse and repeat for the remaining columns...
return B;
}
The reason for this is that you have an interface because you might want to change the implementation. You may eventually have INovel : IBook, IReference : IBook, etc., and then you'll want to have an abstract factory implementation in your data layer.
public static IBook GetBook(int id)
{
// SqlCommand Command = new SqlCommand("SQL or sproc", ValidConnection);
// Command.Parameters.AddWithValue("@id", id);
using(IDataReader DR = Command.ExecuteReader())
{
// checking omitted
switch(DR.GetInt32(1))
{
case 0:
return BookManager.BookFromReader(DR);
case 1:
return BookManager.NovelFromReader(DR);
// etc.
}
}
}
Another benefit of the DAL here is that you can cache lookups. You can have a Dictionary that holds books you've looked up, to reduce extra db calls on objects you've already returned. When an update takes place, you remove the cached entity... That's another post though.
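For what it's worth, a very naive (not thread-safe) sketch of such a cache, assuming BookDB.GetBook ends up implemented as discussed above, might be:
using System.Collections.Generic;

public static class BookCache
{
    // Books already materialized in this process, keyed by id.
    private static readonly Dictionary<int, IBook> _cache = new Dictionary<int, IBook>();

    public static IBook GetBook(int id)
    {
        IBook book;
        if (_cache.TryGetValue(id, out book))
            return book;               // cache hit - no database call

        book = BookDB.GetBook(id);     // cache miss - hit the database
        _cache[id] = book;
        return book;
    }

    // Call this after an update/delete so stale data is not served.
    public static void Invalidate(int id)
    {
        _cache.Remove(id);
    }
}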
If you're using multiple assemblies, interfaces and helper methods will need to reside in a neutral (non-dependent) assembly. Right now in the blog-o-sphere there is a movement towards fewer assemblies, which means fewer dependencies, etc.
Here is a link from a blog I read on this topic:
http://codebetter.com/blogs/patricksmacchia/archive/2008/12/08/advices-on-partitioning-code-through-net-assemblies.aspx
Ultimately, I think the answer is that the data layer returns an instance of your interface to the business layer.
Good luck :-)
In my opinion you should never let the DAL access the BLL. That is an unnecessary dependency.
Putting the Book class into a new project (perhaps named DomainModel) will fix the circular reference. You could do something like this:
Project BLL reference DAL and DomainModel
Project DAL reference DomainModel
Project UI reference BLL and DomainModel
Project DomainModel reference nothing
The DataTable you want to return is database-related, and the BLL shouldn't care about what database you are using or what the schema is.
You could use a DB-to-object mapper to map the db table to an object in the DAL.
If you don't want to return a DataTable, you can pass an IBook implementation in from BookManager for the DAL to populate.
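That alternative could be shaped roughly like this (FillBook and the *Sketch class names are made up for illustration; the mapping lines are left as comments because they depend on how the reader/table is built):
// Sketch only: a hypothetical FillBook method on the DAL side, plus the BLL side that uses it.
public class BookDBSketch
{
    // DAL: fills an IBook supplied by the caller instead of creating the instance itself.
    public static void FillBook(IBook book, int id)
    {
        // ...execute the stored proc / data reader as shown earlier and copy the
        // columns onto 'book', e.g. book.Name = ..., book.Genre = ...
    }
}

public class BookManagerSketch
{
    // BLL: owns the concrete Book type and passes it down to be populated.
    public static IBook GetBook(int id)
    {
        IBook book = new Book();
        BookDBSketch.FillBook(book, id);
        return book;
    }
}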
To follow the intended model, the Data Access Layer (DAL) is responsible for retrieving data from, and sending data to, the data source.
The DAL must not care about any of the business entities your BLL is using, as its only job is to retrieve data and return it in a neutral object. It must be neutral for generic reusability; otherwise you might as well not separate the layers, as you are defeating the purpose of separating them.
Your Business Logic Layer (BLL) must not care how the DAL goes about retrieving or writing data.
To communicate between the BLL and the DAL you must use neutral objects.
Your BLL passes an object's properties as individual parameters to the methods in the DAL.
The parameters in the DAL are neutral: strings, ints, bools, or any other .NET types which are neither specific to the version of the database you are communicating with nor types that exist only in your BLL.
The DAL will retrieve the data from wherever, by whatever means, and return a neutral data object to the caller. This could, for example, be a DataSet or DataTable or any other object NOT specific to the database type/version you are using. Hence DataSet and DataTable live in the System.Data namespace and not in System.Data.SqlClient, etc.
In essence:
- BLL passes neutral types to the DAL (e.g. string, int, bool, long, float, etc.)
- DAL is responsible for converting those types to database-specific types, if required, before passing them on to the data source
- DAL returns neutral data types to the BLL (e.g. DataSet, DataTable, etc.)
- BLL is responsible for using the content of those neutral data types to create, populate and return specific business entities
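To make that flow concrete, here is a rough sketch of both sides under this model (the sproc name, method signatures and connection string handling are placeholders, and I am reusing the DummyBook class from an earlier answer purely as the example entity):
using System.Data;
using System.Data.SqlClient;

// DAL: takes neutral parameters, returns a neutral DataTable.
public static class BookDataAccess
{
    public static DataTable GetBook(int bookId, string connectionString)
    {
        var table = new DataTable();
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("usp_GetBook", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@BookId", bookId);
            using (var adapter = new SqlDataAdapter(command))
            {
                adapter.Fill(table);   // opens and closes the connection itself
            }
        }
        return table;
    }
}

// BLL: turns the neutral DataTable into a business entity.
public static class BookLogic
{
    public static IBook GetBook(int bookId, string connectionString)
    {
        DataTable table = BookDataAccess.GetBook(bookId, connectionString);
        var row = table.Rows[0];   // assumes the sproc returned exactly one row
        var book = new DummyBook
        {
            Id = (int)row["id"],
            Name = (string)row["name"],
            Genre = (string)row["genre"]
        };
        return book;
    }
}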
Your BLL must reference your DAL. That's it.
You can of course completely ignore this model and hack about, as many suggested previously, using IBook, etc., but then you are not using the intended model and might as well throw it all into a single assembly, as you won't be able to maintain the layers independently anyway.