I have to call an Oracle stored procedure from the Phalcon framework. Does anyone know how to call it from a Phalcon model?
I have tried but it doesn't work.
Please help!
There are two ways to perform this task, and neither of them is Phalcon related.
To call a stored procedure you will need to use PDO prepared statements. Since Phalcon implements PDO, you will be able to do this using the db service and not the models. Information on how to do this is here:
http://php.net/manual/en/pdo.prepared-statements.php
This also depends on whether you have the PDO OCI extension (PDO_OCI) installed.
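A minimal sketch of that approach, assuming the PDO_OCI driver is available; the DSN, credentials, and procedure/parameter names below are placeholders:

$pdo = new PDO('oci:dbname=//dbhost:1521/ORCL', 'user', 'password');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
// Wrap the procedure call in an anonymous PL/SQL block and bind the parameters.
$stmt = $pdo->prepare('BEGIN my_package.my_procedure(:id, :name); END;');
$stmt->bindValue(':id', 42, PDO::PARAM_INT);
$stmt->bindValue(':name', 'example', PDO::PARAM_STR);
$stmt->execute();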
The second way is to use the Oracle-supplied OCI8 functions, such as the following (a combined sketch follows the list):
oci_connect
oci_parse (parse the SQL statement)
oci_bind_by_name (bind each parameter to a PHP variable)
oci_execute
oci_free_statement
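Put together, a minimal sketch might look like the following; the connection string, procedure and parameter names are placeholders, and :name is assumed to be an OUT parameter:

$conn = oci_connect('user', 'password', '//dbhost:1521/ORCL');
$stmt = oci_parse($conn, 'BEGIN my_package.my_procedure(:id, :name); END;');
$id = 42;
$name = '';
oci_bind_by_name($stmt, ':id', $id);
oci_bind_by_name($stmt, ':name', $name, 200); // a length is required for OUT parameters
oci_execute($stmt);
echo $name;
oci_free_statement($stmt);
oci_close($conn);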
You could potentially create models of your own that encapsulate the above and call the relevant stored procedure. Upon receiving the data back, you can instantiate a resultset object and populate it with the returned data.
This will offer the normal resultset back to your application, but you won't be able to do much with it, since you rely on stored procedures and not a straight-up model-to-table relationship.
There are long and sometimes heated discussions on why one should or should not use stored procedures, and even more on why one should or should not use Oracle. One thing is clear: stored procedures, for all their benefits, add a level of complexity and impose restrictions on developers. In Oracle's case, with its notorious cursors, those restrictions are a bit more acute.
One last thing to note: if you create your own models (nothing to do with Phalcon), you can expose the stored procedure variables as properties in each model. That way you can set them, make your call to the stored procedure (see the oci_* functions above) with, say, a call() function in that model, and then update the object's properties again with the variables returned from the stored procedure. Such a model can serve the basic CRUD operations by calling the relevant stored procedures, while exposing methods that are a bit friendlier to you, say insert(), get(), delete(), etc.
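A rough sketch of such a hand-rolled model, assuming the oci8 extension; the class, package, and property names are purely illustrative:

class CustomerModel
{
    public $id;
    public $name;
    private $conn; // an oci8 connection resource

    public function __construct($conn)
    {
        $this->conn = $conn;
    }

    // get() delegates to a stored procedure and refreshes the object's properties.
    public function get($id)
    {
        $stmt = oci_parse($this->conn, 'BEGIN customer_pkg.get_customer(:id, :name); END;');
        oci_bind_by_name($stmt, ':id', $id);
        oci_bind_by_name($stmt, ':name', $name, 200);
        oci_execute($stmt);
        oci_free_statement($stmt);
        $this->id = $id;
        $this->name = $name;
        return $this;
    }

    // insert(), update() and delete() would follow the same pattern,
    // each delegating to its own stored procedure.
}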
Related
I've started using EF Core for the first time with ASP.NET Core, and I must confess that I'm not finding the experience that good (especially if you're coming from NHibernate). Now, I've got the following scenario:
2 different DbContexts, each using a different schema
the API has been set up so that some API operations are wrapped by a transaction (in practice, everything has been set up so that there's a single SqlConnection that's shared by both contexts)
I'd like to test these kinds of methods, but I'm not sure about the best approach... After reading the docs, it looks like the best option is using SQLite, but there's a small gotcha: setting up the db.
The docs say that I can use the context.Database.EnsureCreated method. Unfortunately, that will only work if the in-memory db hasn't been created yet. In other words, after calling it from the first context instance, it won't do anything when called on the second context instance (because both contexts share the same db and it has already been created by the first call). In practice, this means that I'll end up with a partial db that only has the tables mapped to the entities of the first context.
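Roughly, the test setup looks like this (simplified; the context names are placeholders):

var connection = new SqliteConnection("DataSource=:memory:");
connection.Open();
var optionsA = new DbContextOptionsBuilder<ContextA>().UseSqlite(connection).Options;
var optionsB = new DbContextOptionsBuilder<ContextB>().UseSqlite(connection).Options;
using (var contextA = new ContextA(optionsA))
using (var contextB = new ContextB(optionsB))
{
    contextA.Database.EnsureCreated(); // creates the db plus ContextA's tables
    contextB.Database.EnsureCreated(); // does nothing: the shared database already exists
}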
Is there a way to force the creation of the second context's tables? I.e., can I write something like this with EF Core:
contextA.Database.EnsureCreated();
contextB.Database.JustCreateTheTablesPlease();
Or do I need to recreate my db from a SQL script before running my tests?
Thanks.
I use Fluent NHibernate code to create a MySQL database SessionFactory. No config files (just one value for the connection string in the connectionStrings section of the configuration file).
The SessionFactory creation code is contained in a Data tier class, SessionFactoryManager, which implements a singleton internal SessionFactory that is used by the Data and Business tiers to get all the sessions via SessionFactoryManager.OpenSession().
Some of my Business tier methods internally call SessionFactoryManager.OpenSession() to create sessions in a way that is transparent to the Presentation layer. So, when calling these methods, there is no parameter or return value involving a session (to keep the Presentation layer "session-agnostic" when using those Business tier methods).
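Simplified, the manager looks roughly like this (the connection string name and mapping class are placeholders):

public static class SessionFactoryManager
{
    private static readonly ISessionFactory Factory =
        Fluently.Configure()
            .Database(MySQLConfiguration.Standard.ConnectionString(
                ConfigurationManager.ConnectionStrings["Default"].ConnectionString))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<SomeEntityMap>())
            .BuildSessionFactory();

    public static ISession OpenSession()
    {
        return Factory.OpenSession();
    }
}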
My problem comes when I write the integration tests for the Business layer: I would like to make them run on a SQLite in-memory database. I create a SessionFactoryManager which uses Fluent configuration to configure the SQLite database.
But when testing those methods that internally create the session, I cannot tell them to use my testing SessionFactory (configured to use SQLite). So the "real" SessionFactory is called, and thus the MySQL database is used, not SQLite.
I'm thinking of several possible solutions, but none of them seems right.
I could migrate the NHibernate configuration in Data layer to config files, and make different NHibernate config files for development/production and test environments, but I really would prefer to keep on with Fluent code.
I could also modify my Data layer to use a single configuration value, databaseMode or similar, that selects the database to be used: the in-memory test one or the real one. And write some switch(databaseMode) statements like "case inMemory: { ... fluent code for in-memory SQLite ... } case standard: { ... fluent code for the standard database ... }". I don't like this approach at all; I don't want to modify my Data tier code just for testing purposes.
Notice that I'm not testing the Data layer, but the Business layer. I'm not interested in testing NHibernate mappings, DAOs or similar functionality; I already have unit tests for that, running OK against a SQLite database.
Also, changing database is not a requirement of my application, so I'm not quite interested in implementing significant changes that allow me to dynamically change the DBMS, I only came to this need in order to write the tests.
A significant point: when using in-memory SQLite, the database connection must be the same for all new sessions; otherwise the database objects are not available to the new sessions. So when creating a new session with SessionFactory.OpenSession(), a "connection" parameter must be provided, but this parameter should not be used with a non-in-memory database. So the switch(databaseMode) would be needed for every single session creation, which is another Data layer code change that I don't like at all.
I'm seriously considering giving up and running my tests against the real database, or at least against an empty one, with its objects created and dropped on every test execution. But then the tests will surely run more slowly. Any ideas? Thanks in advance.
Finally, my solution was Inversion of Control: I changed my Data tier so I can inject a custom SessionFactoryBuilder class that does the Fluently.Configure(...) magic.
In my Data tier I use the "real" MySqlSessionFactoryBuilder; in my test projects I write TestMySqlFactoryBuilder or TestSQLiteSessionFactoryBuilder classes, or whatever I need.
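A minimal sketch of that approach, with the interface and mapping names purely illustrative:

public interface ISessionFactoryBuilder
{
    ISessionFactory BuildSessionFactory();
}

// Production builder, used by the Data tier.
public class MySqlSessionFactoryBuilder : ISessionFactoryBuilder
{
    private readonly string connectionString;

    public MySqlSessionFactoryBuilder(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public ISessionFactory BuildSessionFactory()
    {
        return Fluently.Configure()
            .Database(MySQLConfiguration.Standard.ConnectionString(connectionString))
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<SomeEntityMap>())
            .BuildSessionFactory();
    }
}

// Test builder, used only by the test projects.
public class TestSQLiteSessionFactoryBuilder : ISessionFactoryBuilder
{
    public ISessionFactory BuildSessionFactory()
    {
        return Fluently.Configure()
            .Database(SQLiteConfiguration.Standard.InMemory())
            .Mappings(m => m.FluentMappings.AddFromAssemblyOf<SomeEntityMap>())
            .BuildSessionFactory();
    }
}

// SessionFactoryManager receives the builder instead of configuring NHibernate itself.
public class SessionFactoryManager
{
    private readonly ISessionFactory factory;

    public SessionFactoryManager(ISessionFactoryBuilder builder)
    {
        factory = builder.BuildSessionFactory();
    }

    public ISession OpenSession()
    {
        return factory.OpenSession();
    }
}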
I still have problems with the SQLite feature that requires the same connection to be used for all sessions, and which must be passed as a parameter in every OpenSession() call. For the moment I have not modified my Data tier to add that feature, but I would like to do it in the future. Probably by adding to my SessionFactory singleton a private static member to store the connection used for SchemaExport, and a private static boolean member like PreserveConnection to state that this connection must be stored in that member and used in every OpenSession() call. And also by wrapping OpenSession() and making sure that no session is opened directly.
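A possible sketch of that idea, extending the manager from the previous snippet and assuming an NHibernate version whose OpenSession() accepts a DbConnection; the member names are just the ones suggested above:

public class SessionFactoryManager
{
    private static DbConnection schemaExportConnection; // the connection SchemaExport ran on
    private static bool preserveConnection;             // true only for in-memory SQLite tests

    private readonly ISessionFactory factory;

    public SessionFactoryManager(ISessionFactoryBuilder builder)
    {
        factory = builder.BuildSessionFactory();
    }

    public ISession OpenSession()
    {
        // Reuse the schema-export connection so new sessions can see the in-memory schema.
        if (preserveConnection && schemaExportConnection != null)
        {
            return factory.OpenSession(schemaExportConnection);
        }
        return factory.OpenSession();
    }
}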
I'm evaluating some technologies for a new web application, which should use EF5 and Knockout.js with Web API. I wanted to take advantage of the OData feature when returning IQueryable, but I'm currently running into the problem of how to convert my EF models to my business models.
As far as I've read, if I want to have a more complex DB (computed columns, stored procedures, ...) I should use the DB First approach. (Correct me if I'm wrong.)
Because I need to use the DB First approach and want my models to be independent of the DB, I need to create them in addition to the EF models. And when I return my business model from the data layer as IQueryable, I lose the ability to execute additional queries directly against the DB; instead they are executed directly on the ASP.NET server.
Of course I don't plan to run complex queries over OData, and would implement those as additional actions anyway, but it might be useful for slower clients (smartphones, ...) to limit the returned data and perform additional filters directly on the server.
Is there any way out of this dilemma, to be still able to use OData?
Regards
Peter
You can try using Code First and EF Migrations to create/upgrade the database. With Migrations you can create custom migrations that can be just SQL scripts, to achieve what can't be done automatically with Code First itself. The Database First approach is fine as well.
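For example, a sketch of a code-based migration that drops down to raw SQL for something Code First can't generate on its own (the table and computed column are made up):

public partial class AddFullNameComputedColumn : DbMigration // System.Data.Entity.Migrations
{
    public override void Up()
    {
        // Anything the model builder can't express can be done with a plain SQL script.
        Sql("ALTER TABLE dbo.Customers ADD FullName AS (FirstName + ' ' + LastName)");
    }

    public override void Down()
    {
        Sql("ALTER TABLE dbo.Customers DROP COLUMN FullName");
    }
}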
Ask yourself if you really want or need to support multiple backends. It is possible with EF, but hard to maintain. In this case I assume your conceptual model (csdl) would be the same for all databases, but you would have multiple store-specific models (ssdl files). Since your model would be the same for all databases, you would have the same C# types regardless of the database you are using.
When supporting multiple databases you won't be able to run SQL queries against the database (or, to be more specific, you will get exceptions if you run a SQL query specific to one database against a different database), but ideally you should not need to. In the worst case, you could enclose the logic you would like to code in SQL in a stored procedure that exists in all the databases. Again, I don't know when this would be needed (the only thing that comes to mind is performance), but since you are planning on using OData you wouldn't be able to run these queries anyway unless you start using Service Operations.
Since your conceptual model would be the same regardless of the database, you would have the same types regardless of the database. You could try using these for both the data layer and the business model (especially if you go with POCO). An alternative would be to use POCOs/DTOs. (I have not tried OData support in Web API, but with WCF Data Services the service itself actually uses the EF types, so you would not even be able to tell the service to use a different set of types.)
You actually don't lose the ability, with DB First models, to execute queries against the business model, as long as your transforms aren't too complex. For example, I have an OData service which has a PersistedCustomer (DB model) and a Customer (business model). With EF5, I can write the LINQ which transforms the IQueryable<PersistedCustomer> to IQueryable<Customer>, query against the IQueryable<Customer>, and EF can translate the criteria back against the original database.
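A sketch of that kind of transform; the context, entity, and property names are placeholders:

public IQueryable<Customer> GetCustomers(MyDbContext db)
{
    // Project the DB model onto the business model while keeping the query composable.
    return db.PersistedCustomers.Select(pc => new Customer
    {
        Id = pc.Id,
        Name = pc.Name
    });
}

// Further LINQ/OData criteria applied to the returned IQueryable<Customer>
// are translated back to SQL against the underlying PersistedCustomer table, e.g.:
// GetCustomers(db).Where(c => c.Name.StartsWith("A")).Take(10);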
Hello (I don't have any experience with tests). There is a web application that only has web forms, which print data read from stored procedures and insert data using stored procedures as well. All the business logic is in the stored procedures. I have to say I don't like the way the application was built, but it is mandatory to have as many tests as possible. Most of the methods return void (because they are, for instance, button_clicked methods that read from text boxes and call the stored procedure; the stored procedure does everything), so I can't do unit testing. Could you recommend some tests that I can write and document that fit this web application? Thanks a lot.
If most of your business logic sits in stored procedures, it's best that you implement some tests for your procs.
Have a look at the TST framework for testing your SQL:
http://tst.codeplex.com/
As for the website, what you can do is probably ensure that you are calling the correct stored procedures on the invocation of some buttons.
e.g. you don't want to call the "ChangeName" stored procedure when the "Calculate Monthly Salary" button is clicked.
I'm making my first WCF service and I am unsure which route I should take with stored procedures and LINQ to SQL. I understand that I can drag and drop stored procs onto my DBML file and call them that way, or call them directly, not using the DBML. Is there a reason why I should choose one over the other? I guess I'm a little confused... Any input is greatly appreciated!
Well, do you already have a LINQ to SQL data model which you use in your WCF service? If so, I would probably put my stored procedures into that data model.
If you don't already have and use a LINQ to SQL data model, I don't really see much use or sense in creating one just to be able to call a stored procedure.
If you don't already have a LINQ to SQL data model, I'd probably just use straight ADO.NET code to call that stored procedure, send in any parameters coming from the WCF service method, and pass back any data you need to send back. In that case, you'd use a SqlConnection, a SqlCommand (with CommandType set to StoredProcedure), a bunch of SqlParameters, and then call the command.ExecuteNonQuery() or command.ExecuteReader() methods (depending on what your stored proc is doing).
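A minimal sketch of that, assuming System.Data.SqlClient; the connection string, procedure and parameter names are placeholders:

public int UpdateCustomerName(string connectionString, int customerId, string newName)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand("dbo.UpdateCustomerName", connection))
    {
        command.CommandType = CommandType.StoredProcedure; // System.Data
        command.Parameters.AddWithValue("@CustomerId", customerId);
        command.Parameters.AddWithValue("@NewName", newName);

        connection.Open();
        return command.ExecuteNonQuery(); // use ExecuteReader() if the proc returns rows
    }
}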
If you come across a situation in which you need to manipulate the arguments of the stored procedure dynamically, you may want to create a class that calls the stored procedure via the WCF service. Check how it is done in this post:
Executing SPs from WCF Service