Does SQLite support stored procedures?

I am evaluating the SQLite database for my requirements. Does SQLite support stored procedures?
If yes, what are the limitations? Does SQLite support the range?

No, it does not. See Appropriate Uses For SQLite on the main site.

A key reason for having stored procedures in a database is that the SP code executes in the same process as the SQL engine. This makes sense for database engines designed to work as a network-connected service, but the argument carries much less weight for SQLite: since it runs as a DLL inside your current process, it makes more sense to implement stored-procedure logic in the client language.
You can, however, extend SQLite with your own user-defined functions in the host language (PHP, Python, Perl, C#, JavaScript, Ruby, etc.). I've done this in C# using Devart's SQLite provider to implement password hashing.
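As a rough illustration, here is a minimal sketch of registering a user-defined function from C#. It assumes the Microsoft.Data.Sqlite provider rather than the Devart one mentioned above, and the function name sha256hex is made up for the example:

using System;
using System.Security.Cryptography;
using System.Text;
using Microsoft.Data.Sqlite;

using var conn = new SqliteConnection("Data Source=app.db");
conn.Open();

// Register a scalar SQL function implemented in C#; afterwards SQL such as
// SELECT sha256hex(password) FROM users would call back into this lambda.
conn.CreateFunction("sha256hex", (string s) =>
    Convert.ToHexString(SHA256.HashData(Encoding.UTF8.GetBytes(s))));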

If you really want to store SQL code in the DB (for example, when you are developing cross-platform apps), you can create a dedicated table that stores the raw SQL commands, then have the client fetch the command it needs, e.g.
var myString = (string)db.CreateCommand("SELECT SqlColumn FROM tablewithsqlcommands WHERE Procname = 'theprocedureIwant'").ExecuteScalar();
and then execute it in a second step
var myResult = db.CreateCommand(myString).whatever_execution_method_you_need();
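For completeness, a minimal sketch of what that table could look like; the table and column names come from the snippet above, but the schema itself is an assumption:

CREATE TABLE tablewithsqlcommands (
    Procname  TEXT PRIMARY KEY,  -- name the client looks up
    SqlColumn TEXT NOT NULL      -- raw SQL text to execute
);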

Related

Changing the database from SQL Server to PostgreSQL in an ASP.NET web application?

I am currently using SQL Server as the database and Dapper (a micro-ORM) to map relations to model classes. I have used Dapper's multiple-read feature so that I can fetch several tables in a single call to the database. But due to certain circumstances, I have to change my database from SQL Server to PostgreSQL. PostgreSQL does not seem to offer an equivalent facility for multiple reads from one query.
Is there any way to handle the situation below in PostgreSQL?
using (var multi = conn.QueryMultiple(query, param, commandType: CommandType.StoredProcedure))
{
    obj.InventoryItemOrder = multi.Read<InventoryItemOrder>()
        .FirstOrDefault(); // getting a single object, hence FirstOrDefault
    obj.InventoryItemDataModel = multi.Read<InventoryItemDataModel>(); // list
}
Can I use this concept when dealing with PostgreSQL and Dapper in building an ASP.NET application?
I'm not familiar with Dapper, but from a quick look at the docs, it appears that QueryMultiple basically runs multiple statements in the same command (that is, separated by semicolons) and then maps the results of each statement.
You can certainly do the first part of that with Postgres: combine multiple statements into one separated by a semicolon. They will both be run. However, I am not aware of anything within Postgres itself that will automatically return results of the statements separately. Depending on exactly how you are interfacing with Postgres and getting the results, you could perhaps create a custom method of mapping these results.
From a purely database perspective, you don't gain much, if any, performance advantage from the behavior described in QueryMultiple as long as you issue the separate queries on the same connection.
It's the connection startup and teardown that is expensive, and even Dapper would have to issue and map the results for each query, so there is no performance benefit there. Rather, it's essentially a matter of syntactic sugar.
It's not clear from a quick look at the doc if Dapper is compatible with Postgres. If it is, and its QueryMultiple call is supported there, then it's probably handling the mapping of the multiple statements within the ORM itself.
If Dapper does not support Postgres, however, I would recommend simply issuing the queries separately and handling their results separately.
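For what it's worth, here is a minimal sketch of that fallback using Dapper on top of Npgsql (Dapper sits on ADO.NET, so this is plausible, but treat it as untested); the SQL, the table names, and the variables connectionString and id are placeholders:

using System.Linq;
using Dapper;
using Npgsql;

using var conn = new NpgsqlConnection(connectionString);

// Issue the two queries separately, reusing the same connection.
var order = conn.Query<InventoryItemOrder>(
        "SELECT * FROM inventory_item_order WHERE id = @Id", new { Id = id })
    .FirstOrDefault();
var items = conn.Query<InventoryItemDataModel>(
        "SELECT * FROM inventory_item_data_model WHERE order_id = @Id", new { Id = id });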

Encrypt the SQL query

In my C++ program I invoke SQLite to execute SQL queries, but those queries are stored as ordinary string constants in the C++ code, so they are easy to recover by reverse engineering. Does SQLite provide a good way to encrypt the SQL query strings without affecting performance when the queries are executed?
Thanks

SDO vs DB Adapter in Oracle 11g

I am publishing this post to get at the underlying idea behind the real-world use of this technology.
I know this isn't a common question, but that doesn't mean it isn't important.
If you were working with lots of tables in a database, and you were using lots of BPEL services, would you choose SDO (Service Data Objects) instead of DB Adapters (Database Adapters)?
I have been working with SDOs for a few weeks and I find them really useful, but I'm not sure whether SDOs are better than DB Adapters or not...
What do you think: SDOs or DB Adapters?
Thanks in advance.
Basically, SDO is Oracle SOA's attempt at an ORM, so you can simply look for information comparing ORMs with plain JDBC. The DBAdapter is slightly different from plain JDBC in that it has extra features around polling and stored-procedure integration.
DBAdapter -> simple SQL, stored procedures, basic read, write, delete and update, and polling
SDO -> highly reusable code and everything that doesn't suit the DBAdapter.
Here is a thread to look at http://forum.spring.io/forum/spring-projects/data/14117-jdbc-or-orm-framework-what-are-the-pros-and-cons

using SQLite with mod_perl

I have been successfully using SQLite as a data store for my web applications, but now I am implementing a web site with mod_perl, and am running into database locking issues.
As expected, my entire web application is loaded by the Plack Apache handler (Plack::Handler::Apache2) when the web server is started. Well, the first db query creates a lock on the entire database, and any subsequent query that has to modify the db fails.
What is my way out? Can I use SQLite in a persistent web environment or not? Should I be looking for some other db store?
I am not a fan of MySQL and don't want to use it. I could potentially use Postgres, but I'd rather use something lightweight, and preferably SQL-based, since using a key/value database such as Tokyo Cabinet would require learning a whole new approach. I'd really rather use SQLite.
If you have an open statement handle on the database, it can cause this issue. I have had problems where iterating over a result set during a long-running process causes the lock to stick around.
Try to fetch all the rows for the query and call $sth->finish() to clear the lock. You will use a little more memory, but you will avoid the locking.
Knowing you are going to do this, you can make use of $sth->fetchall_arrayref() or $sth->fetchall_hashref().
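A minimal sketch of that pattern with DBI (the database file, table, and query are placeholders):

use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:SQLite:dbname=app.db", "", "", { RaiseError => 1 });
my $sth = $dbh->prepare("SELECT id, name FROM items");
$sth->execute();

# Pull the whole result set into memory so no cursor stays open...
my $rows = $sth->fetchall_arrayref({});   # arrayref of hashrefs

# ...and release the statement handle (and its lock) explicitly.
$sth->finish();

for my $row (@$rows) {
    print "$row->{id}: $row->{name}\n";
}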
Use Tokyo Cabinet's table database.

Which is fastest to transmit: XML or DataTables?

I would like to know which is faster. Let me give you the scenario. I'm on a LAN and have a report to build using data from a SQL Server database (if the version matters, say 2005), and I have these ways of getting the report done:
1. Have a web service on the server, where the data is read from the database and serialized into XML. The client uses this XML as the source for a report built on the client machine. The client would be a Windows Forms app.
2. From the client side, connect to the database using ADO.NET, get a DataTable, and use it as the source for the report built on the client.
3. The same as (2) but using a DataReader.
Also, is there a better way to do this?
The serialization to XML is going to cost you, both in the time it takes to serialize and deserialize and in the overhead of the XML structure. It will, however, provide a format that is consumable by more technologies. If you are using .NET end-to-end, and that isn't likely to change, I would not use XML; I'd use the framework-provided data access methods. Personally, I would probably use LINQ over DataTables or a DataReader, but that is more for ease of use and readability on the client side than for any performance advantage.
The best practice is to not use .NET-specific types in the interface of a web service. Even if you are certain today that your service will never be called by anything other than a .NET program, things change, and tomorrow you may be told that the service will be called by a Perl program.
Perl programs don't understand DataSet. Nor do Java programs, nor anything other than .NET.
The best practice is to create a Data Transfer Object containing just the data you need to transfer, in simple properties with primitive types, or collections or arrays of primitive types, or collections or arrays of Data Transfer Objects, etc. These will be understandable by any client.
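For illustration, a minimal sketch of such a Data Transfer Object in C# (the report fields here are invented for the example):

using System;

// Plain data carrier: only primitive properties, no DataSet/DataTable or
// other .NET-specific types, so any client can consume the serialized form.
public class ReportRowDto
{
    public int OrderId { get; set; }
    public string ProductName { get; set; }
    public int Quantity { get; set; }
    public DateTime OrderedOn { get; set; }
}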
