I'm evaluating some technologies for a new web application, which should use EF5 and Knockout JS with Web API. I wanted to take advantage of the OData support when returning IQueryable, but I'm currently running into the problem of how to convert my EF models to my business models.
As far as I've read, if I want a more complex database (computed columns, stored procedures, ...), I should use the DB-First approach. (Correct me if I'm wrong.)
Because I need the DB-First approach and want my models to be independent of the DB, I need to create them in addition to the EF models. And when my data layer returns my business model as IQueryable, I lose the ability to execute additional queries directly against the DB; instead they are executed on the ASP.NET server itself.
Of course I don't plan to run complex queries over OData, and I would implement those as additional actions anyway, but it might be useful for slower clients (smartphones, ...) to limit the returned data and apply additional filters directly on the server.
Is there any way out of this dilemma so that I can still use OData?
Regards
Peter
You can try using Code First and EF Migrations to create/upgrade the database. With migrations you can create custom migrations that are just SQL scripts, to achieve what can't be done automatically with Code First itself. The Database First approach is fine as well.
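For example, a custom migration can drop down to raw SQL for the pieces Code First can't generate on its own. A minimal sketch, assuming EF-style code-based migrations; the table, column and procedure names are purely illustrative:

using System.Data.Entity.Migrations;

public partial class AddComputedColumnAndProc : DbMigration
{
    public override void Up()
    {
        // Raw SQL for things Code First can't express, e.g. a computed column
        // and a stored procedure.
        Sql("ALTER TABLE dbo.Orders ADD Total AS (Quantity * UnitPrice) PERSISTED");
        Sql(@"CREATE PROCEDURE dbo.GetTopCustomers AS
              SELECT TOP 10 * FROM dbo.Customers ORDER BY TotalSales DESC");
    }

    public override void Down()
    {
        Sql("DROP PROCEDURE dbo.GetTopCustomers");
        DropColumn("dbo.Orders", "Total");
    }
}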
Ask yourself whether you really want/need to support multiple backends. It is possible with EF but hard to maintain. In that case I assume your conceptual model (CSDL) would be the same for all databases, but you would have multiple store-specific models (SSDL files). Since your conceptual model would be the same for all databases, you would have the same C# types regardless of the database you are using.
When supporting multiple databases you won't be able to run raw SQL queries against the database (or, to be more specific, you will get exceptions if you run a SQL query specific to one database against a different database), but ideally you should not need to. In the worst case you could wrap the logic you would like to code in SQL in a stored procedure that exists in all databases. Again, I don't know when this would be needed (the only thing that comes to mind is performance), but since you are planning on using OData you wouldn't be able to run these queries anyway unless you start using Service Operations.
Since your conceptual model would be the same regardless of the database, you would have the same types regardless of the database. You could try using these for both your data layer and your business model (especially if you go with POCO). An alternative would be to use POCOs/DTOs. (I have not tried the OData support in Web API, but with WCF Data Services the service itself would actually use the EF types, so you would not even be able to tell the service to use a different set of types.)
With DB-First models you actually don't lose the ability to execute queries against the business model, as long as your transforms aren't too complex. For example, I have an OData service which has a PersistedCustomer (DB model) and a Customer (business model). With EF5, I can write the LINQ that transforms the IQueryable&lt;PersistedCustomer&gt; into an IQueryable&lt;Customer&gt;, query against that IQueryable&lt;Customer&gt;, and EF can translate the criteria back against the original database.
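To illustrate, here is a minimal sketch; the context, entity and property names are assumptions, not the poster's actual code:

public IQueryable<Customer> GetCustomers()
{
    // The projection stays an expression tree, so OData options ($filter, $top, ...)
    // applied to the returned IQueryable<Customer> are still translated to SQL by EF.
    return _context.PersistedCustomers.Select(p => new Customer
    {
        Id = p.Id,
        Name = p.FirstName + " " + p.LastName
    });
}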
I've started using EF Core for the first time with ASP.NET Core, and I must confess I'm not finding the experience that good (especially if you're coming from NHibernate). Now, I've got the following scenario:
2 different DbContexts, each of which uses a different schema
the API has been set up so that some API operations are wrapped in a transaction (in practice, everything has been set up so that there's a single SqlConnection that's shared by both contexts)
I'd like to test these kinds of methods, but I'm not sure about the best approach... After reading the docs, it looks like the best option is using SQLite, but there's a small gotcha: setting up the db.
The docs say that I can use the context.Database.EnsureCreated method. Unfortunately, that will only work if the in-memory db hasn't been created yet. In other words, after calling it from the first context instance, it won't do anything when called on the second context instance (because both contexts share the same db and it has already been created by the first call). In practice, this means that I'll end up with a partial db that only has the tables mapped to the entities of the first context.
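To make the scenario concrete, here is a minimal sketch of the setup (the context names are illustrative; this assumes Microsoft.Data.Sqlite and the EF Core SQLite provider):

// The in-memory db lives as long as this connection stays open.
var connection = new SqliteConnection("DataSource=:memory:");
connection.Open();

var optionsA = new DbContextOptionsBuilder<CatalogContext>().UseSqlite(connection).Options;
var optionsB = new DbContextOptionsBuilder<OrdersContext>().UseSqlite(connection).Options;

using (var contextA = new CatalogContext(optionsA))
using (var contextB = new OrdersContext(optionsB))
{
    contextA.Database.EnsureCreated(); // creates only CatalogContext's tables
    contextB.Database.EnsureCreated(); // no-op: the shared database already exists
}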
Is there a way to force the creation of the second context's tables? I.e., can I write something like this with EF Core:
contextA.Database.EnsureCreated();
contextB.Database.JustCreateTheTablesPlease();
Or do I need to recreate my db from a SQL script before running my tests?
Thanks.
I have to call an Oracle stored procedure in the Phalcon framework. Does anyone know how to call it from a Phalcon model?
I have tried but it doesn't work.
Please help!
There are two ways to perform this task, and neither of them is Phalcon-related.
To call a stored procedure you will need to use PDO prepared statements. Since Phalcon implements PDO, you will be able to do this using the db service rather than the models. Information on how to do this is here:
http://php.net/manual/en/pdo.prepared-statements.php
This also depends on whether you have installed the OCI-related PDO extension.
The second way is to use the Oracle-supplied functions, such as:
oci_connect
oci_parse (the SQL statement goes here)
oci_bind_by_name (bind each parameter to a PHP variable)
oci_execute
oci_free_statement
You could potentially create models of your own that will encapsulate the above and call the relevant stored procedure. Upon receiving the data back you can instantiate a resultset object and populate it with the returned data.
This will offer the normal resultset back to your application but you won't be able to do much with it since you rely on stored procedures and not a straight up model->table relationship.
There are long and sometimes heated discussions on why one should or should not use stored procedures, and even more on why one should or should not use Oracle. One thing is clear: stored procedures, for all their benefits, introduce a level of complexity and restrictions for developers. In Oracle's case, with its notorious cursors, those restrictions are a bit more acute.
One last thing to note: if you create your own models (nothing to do with Phalcon), you can have the stored procedure variables as properties on each model. That way you will be able to set them, make your call to the stored procedure (see the oci_* functions above) with, say, a call() function in that model, and then update the object's properties again with the variables returned from the stored procedure. Such a model will be able to serve the basic CRUD operations by calling the relevant stored procedures, while exposing methods that are a bit friendlier to you, such as insert(), get(), delete(), etc.
I use Fluent NHibernate code to create a SessionFactory for a MySQL database. No config files (just one value for the connection string in the connectionStrings section of the configuration file).
The SessionFactory creation code is contained in a data-tier class, SessionFactoryManager, which implements a singleton internal SessionFactory that the data and business tiers use to obtain all their sessions via SessionFactoryManager.OpenSession().
Some of my business-tier methods internally call SessionFactoryManager.OpenSession() to create sessions in a way that is transparent to the presentation layer. So when calling these methods there is no parameter or return value involving a session (to keep the presentation layer "session-agnostic" when using those business-tier methods).
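For reference, a minimal sketch of what that looks like (the mapping class and connection string key are made up; the real configuration is more involved):

public static class SessionFactoryManager
{
    // Singleton SessionFactory built once with Fluent NHibernate, pointing at MySQL.
    private static readonly ISessionFactory Factory =
        Fluently.Configure()
            .Database(MySQLConfiguration.Standard.ConnectionString(
                c => c.FromConnectionStringWithKey("Main")))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<CustomerMap>())
            .BuildSessionFactory();

    // The data and business tiers get all their sessions through this method.
    public static ISession OpenSession()
    {
        return Factory.OpenSession();
    }
}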
My problem comes when I write the integration tests for the business layer: I would like to run them against an in-memory SQLite database. I create a SessionFactoryManager which uses Fluent configuration to configure the SQLite database.
But when testing the methods that internally create their session, I cannot tell them to use my testing SessionFactory (configured to use SQLite). So the "real" SessionFactory is used, and therefore the MySQL database is hit, not SQLite.
I'm thinking of several possible solutions, but none of them seems right.
I could migrate the NHibernate configuration in the data layer to config files, and make different NHibernate config files for the development/production and test environments, but I would really prefer to stick with Fluent code.
I could also modify my data layer to use a single configuration value, databaseMode or similar, that sets which database to use: the in-memory testing one or the real one, and write switch(databaseMode) statements like "case inMemory: { ... fluent code for in-memory SQLite ... } case standard: { ... fluent code for the standard database ... }". I don't like this approach at all; I don't want to modify my data-tier code just for testing purposes.
Notice that I'm not testing the data layer but the business layer. I'm not interested in testing NHibernate mappings, DAOs or similar functionality; I already have unit tests for that, which run fine against a SQLite database.
Also, switching databases is not a requirement of my application, so I'm not interested in implementing significant changes to allow dynamically changing the DBMS; I only arrived at this need in order to write the tests.
A significant point: when using in-memory SQLite, the database connection must be the same for all new sessions; otherwise the database objects are not visible to them. So when creating a new session with SessionFactory.OpenSession(), a "connection" parameter must be provided, but that parameter should not be used with a non-in-memory database. So the switch(databaseMode) would be needed for every single session creation! Another data-layer code change that I don't like at all.
I'm seriously considering giving up and running my tests against the real database, or at least against an empty one whose objects are created and dropped on each test run. But then test execution will surely be slower. Any ideas? Thanks in advance.
In the end my solution was Inversion of Control: I changed my data tier so that I can inject a custom SessionFactoryBuilder class that performs the Fluently.Configure(...) magic.
In my data tier I use the "real" MySqlSessionFactoryBuilder; in my test projects I write TestMySqlFactoryBuilder or TestSQLiteSessionFactoryBuilder classes, or whatever I need.
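In outline it looks roughly like this (only a sketch of the idea; the interface and the builder bodies are assumptions based on the description above):

public interface ISessionFactoryBuilder
{
    ISessionFactory Build();
}

// "Real" builder used by the data tier.
public class MySqlSessionFactoryBuilder : ISessionFactoryBuilder
{
    public ISessionFactory Build()
    {
        return Fluently.Configure()
            .Database(MySQLConfiguration.Standard.ConnectionString(
                c => c.FromConnectionStringWithKey("Main")))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<CustomerMap>())
            .BuildSessionFactory();
    }
}

// Builder used only by the integration tests.
public class TestSQLiteSessionFactoryBuilder : ISessionFactoryBuilder
{
    public ISessionFactory Build()
    {
        return Fluently.Configure()
            .Database(SQLiteConfiguration.Standard.InMemory())
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<CustomerMap>())
            .BuildSessionFactory();
    }
}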
I still have problems with the SQLite requirement that the same connection be used for all sessions and passed as a parameter in every ISessionFactory.OpenSession() call. For the moment I have not modified my data tier to support that, but I would like to in the future: probably by adding to my SessionFactory singleton a private static member to store the connection used for SchemaExport, plus a private static boolean member like PreserveConnection to indicate that this connection must be kept and reused in every OpenSession() call, and also by wrapping OpenSession() so that no session is opened directly.
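Something along these lines (only a sketch of the idea, not finished code; the members simply mirror the description above):

public static class SessionFactoryManager
{
    private static ISessionFactory _factory;          // built via the injected builder
    private static IDbConnection _schemaConnection;   // connection used for SchemaExport
    private static bool _preserveConnection;          // true only for in-memory SQLite tests

    public static ISession OpenSession()
    {
        // With in-memory SQLite every session must reuse the connection the schema
        // was exported on; otherwise the database objects are not visible to it.
        return _preserveConnection
            ? _factory.OpenSession(_schemaConnection)
            : _factory.OpenSession();
    }
}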
I'm wondering which approach is better from a performance point of view: is it better to use one web-service method to load data by passing the database table name and keys, or to use a separate method for each database table? Note that I'm using .NET ASMX services through AJAX requests.
It's obvious that one method is better from an OO perspective, since it has a single responsibility ("data loading"), but what about performance? Is IIS affected by that or not? Also, is it better to make multiple web services (ASMX files) or just one?
I really don't think that creating separate methods for fetching data from different tables is necessary. The performance gain/loss you are likely to experience by passing an additional table-name parameter to your web service call would be too small to even consider, unless your table names are really huge, which I don't think is the case.
The only reason I would even consider doing something like this is if I had nothing else left to do in terms of performance improvement, or if I were forced to ;-).
If you really want to optimize your request size, try:
serializing your input params using JSON (if you are not doing it already)
using a cookieless domain for your web service
hope this helps
I don't think the service level should have any knowledge of database tables, just like you ideally don't want to see data access code in a controller action or an ASPX page's code-behind.
Personally, I prefer to organize my services to match my domain model.
If I have Customer, Order, and Item classes, for example, I would have corresponding Customer.asmx, Order.asmx, and Item.asmx services to expose selected methods within those classes.
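For example, the code-behind for Customer.asmx might look roughly like this (a minimal sketch; the repository class and namespace are illustrative assumptions, and the methods exposed are just examples):

using System.Collections.Generic;
using System.Web.Services;

[WebService(Namespace = "http://example.com/services")]
[System.Web.Script.Services.ScriptService] // so the methods can be called from client-side AJAX
public class CustomerService : WebService
{
    private readonly CustomerRepository _repository = new CustomerRepository();

    // Expose only the selected domain operations, not raw table access.
    [WebMethod]
    public Customer GetById(int id)
    {
        return _repository.GetById(id);
    }

    [WebMethod]
    public List<Customer> GetRecent(int count)
    {
        return _repository.GetRecent(count);
    }
}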
Services are typically responsible for exposing business functionality through a contract. I realize ASMX services really have no concept of "contracts" in the broadest sense; however, you can think of one as the set of operations supported by the service. What is your goal here: do you want to expose tabular data as a service?
Service technology on the Microsoft stack has come a long way from ASMX. Perhaps an obvious question: have you looked at WCF Data Services?
Links:
Exposing Your Data as a Service (WCF Data Services)
Getting Started with WCF Data Services
I used to create normal web services in my websites and call these services from JavaScript to make AJAX calls.
Now I am learning about ADO.NET Data Services.
My question is:
Can ADO.NET Data Services replace my normal web services in the new sites I will create?
And if yes,
can I put these ADO.NET Data Services in a separate project (local, on the same server) and just reference them from my website? (So I can use the same services for my websites' internal use and also offer the same services to other websites or services, the same way Twitter does, for example.)
It depends on what you want to do. I suggest you read my conversation with Pablo Castro, the architect of ADO.NET Data Services:
Data Services - Lacking
Here are, basically, Pablo's words:
I agree that some of these things are quite inconvenient and we're looking at fixing them (e.g. use of custom types in addition to types defined in the input model in order to produce custom result-sets). However, some others are just intrinsic to the nature of Data Services.
The Data Services framework is not a gateway to a database, and in general if you need something like that then Data Services will just get in the way. The goal of Data Services is to create a resource model out of an input data model and expose it through a RESTful, uniform interface, such that every unit of data in the underlying model ("entities") becomes an addressable resource that can be manipulated with the standard verbs.
Often the actual implementation of a RESTful interface includes more sophisticated behaviors than just doing CRUD over the data under the covers, and these need to be defined in a way that doesn't break the uniform interface. That's why the Data Services server runtime has hooks for business logic and validation in the form of query/change interceptors and others. We also acknowledge that it's not always possible or practical to model absolutely everything as resources operated on with standard verbs, so we included service operations as an escape hatch.
Things like joins dilute the abstraction we're trying to create. I'm not saying that they are bad or anything (relational databases without them wouldn't be all that useful); it's just that if what a given application scenario requires is the full query expressiveness of a relational database at the service boundary, then you can simply exchange queries over the wire (and manage the security implications of that). For joins that can be modeled as association traversals, Data Services already has support.
I guess this is a long way to say that Data Services is not a solution for every problem that involves exposing data to the web. If you want a RESTful interface over a resource model that matches your underlying data model, then it usually works out well and it will save you a lot of work. If you need a custom interface or direct access to a database, then Data Services is typically not the right tool, and other framework components such as WCF's SOAP and REST support do a great job at that.
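As a concrete illustration of the query/change interceptor hooks mentioned above, here is a minimal sketch (the data service class, entity model and property names are assumptions, not code from the discussion):

using System;
using System.Data.Services;
using System.Linq.Expressions;
using System.Web;

public class OrdersService : DataService<NorthwindEntities>
{
    public static void InitializeService(DataServiceConfiguration config)
    {
        config.SetEntitySetAccessRule("Orders", EntitySetRights.AllRead);
    }

    // Business-logic hook: only expose orders that belong to the current user,
    // without breaking the uniform RESTful interface.
    [QueryInterceptor("Orders")]
    public Expression<Func<Order, bool>> OnQueryOrders()
    {
        var userName = HttpContext.Current.User.Identity.Name;
        return o => o.OwnerName == userName;
    }
}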