.NET Core Informix multiple parameters in IfxCommand

This is the environment I work in:
.NET Core 3.1 (console application for testing purposes)
CSDK 4.50.FC5
Informix.Net.Core.dll from the CSDK 4.50.FC5 package
Informix Server 12.10
The problem I have is that some of the queries from my .NET Core app execute successfully and retrieve results from the Informix database, but sometimes I get strange errors that have something to do with parameters. Of course, I am trying to use IfxParameter objects in order to be safe from SQL injection attacks.
This passes successfully (table names and columns are made up). Here I am using positional parameters:
IfxCommand command = connection.CreateCommand();
command.CommandType = System.Data.CommandType.Text;
command.CommandText = "SELECT * FROM employees WHERE name LIKE '%John%' SKIP ? LIMIT ?";
IfxParameter paramSkip = new IfxParameter("skip", IfxType.Integer);
paramSkip.Value = 30;
command.Parameters.Add(paramSkip);
IfxParameter paramLimit = new IfxParameter("limit", IfxType.Integer);
paramLimit.Value = 10;
command.Parameters.Add(paramLimit);
connection.Open();
using (IfxDataReader reader = command.ExecuteReader())
{
... // reading data from the IfxDataReader
}
Now let's look at this example, which produces an error:
IfxCommand command = connection.CreateCommand();
command.CommandType = System.Data.CommandType.Text;
command.CommandText = "SELECT * FROM employees WHERE name LIKE ? SKIP 10 LIMIT ?";
IfxParameter paramSearch = new IfxParameter("searchQuery", IfxType.VarChar);
paramSearch.Value = "%John%";
command.Parameters.Add(paramSearch);
IfxParameter paramLimit = new IfxParameter("limit", IfxType.Integer);
paramLimit.Value = 10;
command.Parameters.Add(paramLimit);
connection.Open();
using (IfxDataReader reader = command.ExecuteReader())
{
...
}
Error:
IfxException: ERROR [22018] [Informix][Informix ODBC Driver][Informix]A character to numeric conversion process failed
The only difference here is that the parameters are not of the same type. In the first example both parameters were IfxType.Integer; now I have IfxType.VarChar and IfxType.Integer.
Using named parameters doesn't help. For the following IfxCommand:
command.CommandText = "SELECT * FROM employees WHERE name LIKE '%John%' SKIP 10 LIMIT @limit";
I get the following error:
Error: IfxException: ERROR [42000] [Informix][Informix ODBC Driver][Informix]A syntax error has occurred.
I hope someone can point me in the right direction. I am open to any suggestion that will solve this. If any further info is needed, please hit me up!

This is because of the precedence of positional parameters: parameters for the WHERE clause have lower precedence than those for SKIP and LIMIT, i.e. the SKIP/LIMIT placeholders are bound first. Therefore, add the WHERE parameter (paramSearch) to the collection at the end.
Try:
IfxCommand command = connection.CreateCommand();
command.CommandType = System.Data.CommandType.Text;
command.CommandText = "SELECT * FROM employees WHERE name LIKE ? SKIP 10 LIMIT ?";
// The LIMIT parameter is added first...
IfxParameter paramLimit = new IfxParameter("limit", IfxType.Integer);
paramLimit.Value = 10;
command.Parameters.Add(paramLimit);
// ...and the WHERE parameter is added last.
IfxParameter paramSearch = new IfxParameter("searchQuery", IfxType.VarChar);
paramSearch.Value = "%John%";
command.Parameters.Add(paramSearch);
...
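For completeness, a minimal end-to-end sketch of the reordered version (the connection setup, the using blocks and the reader loop are illustrative assumptions, not part of the original answer):

using (IfxConnection connection = new IfxConnection(connectionString)) // connectionString assumed
using (IfxCommand command = connection.CreateCommand())
{
    command.CommandType = System.Data.CommandType.Text;
    command.CommandText = "SELECT * FROM employees WHERE name LIKE ? SKIP 10 LIMIT ?";
    // LIMIT placeholder bound first, WHERE placeholder bound last.
    command.Parameters.Add(new IfxParameter("limit", IfxType.Integer) { Value = 10 });
    command.Parameters.Add(new IfxParameter("searchQuery", IfxType.VarChar) { Value = "%John%" });
    connection.Open();
    using (IfxDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            Console.WriteLine(reader["name"]); // "name" column taken from the made-up schema
        }
    }
}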

Related

How to use "LIKE" in parameterized Cosmos query

I'm using the Azure.Cosmos API, version 3.17.1. If I run a query against my Cosmos collection like so, I get the results I expect:
string Sql = "SELECT c.LastName FROM c where c.LastName like 'Smi%'";
QueryDefinition oQry = new QueryDefinition(Sql);
FeedIterator<myObj> oFI = this.container.GetItemQueryIterator<myObj>(oQry); // returns Smith, Smithers, etc.
If I try to parameterize this, I get nothing back:
string Sql = "SELECT c.LastName FROM c where c.LastName like '#KeyWord%'";
QueryDefinition oQry = new QueryDefinition(Sql);
oQry.WithParameter(#KeyWord, "Smi");
FeedIterator<myObj> oFI = this.container.GetItemQueryIterator<myObj>(oQry);
Is this a syntax issue or something not supported?
Tks
Don't put the % inside the query string, and don't quote the parameter placeholder; pass the wildcard as part of the parameter value instead.
Please try this code:
string Sql = "SELECT c.LastName FROM c where c.LastName like #KeyWord";
QueryDefinition oQry = new QueryDefinition(Sql);
oQry.WithParameter("#KeyWord", "Smi%");
FeedIterator<myObj> oFI = container.GetItemQueryIterator<myObj>(oQry);
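To consume the results, a loop along these lines should work (a sketch, assuming an async context; myObj and container come from the question):

while (oFI.HasMoreResults)
{
    FeedResponse<myObj> page = await oFI.ReadNextAsync();
    foreach (myObj item in page)
    {
        Console.WriteLine(item.LastName); // LastName assumed from the projected query
    }
}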

DBGrid, Bookmarks and SQLite very slow [duplicate]

I recently read about SQLite and thought I would give it a try. When I insert one record it performs okay, but when I insert one hundred it takes five seconds, and as the record count increases so does the time. What could be wrong? I am using the SQLite wrapper (System.Data.SQLite):
dbcon = new SQLiteConnection(connectionString);
dbcon.Open();
//---INSIDE LOOP
SQLiteCommand sqlComm = new SQLiteCommand(sqlQuery, dbcon);
nRowUpdatedCount = sqlComm.ExecuteNonQuery();
//---END LOOP
dbcon.Close();
Wrap BEGIN/END statements around your bulk inserts. SQLite is optimized for transactions.
dbcon = new SQLiteConnection(connectionString);
dbcon.Open();
SQLiteCommand sqlComm;
sqlComm = new SQLiteCommand("begin", dbcon);
sqlComm.ExecuteNonQuery();
//---INSIDE LOOP
sqlComm = new SQLiteCommand(sqlQuery, dbcon);
nRowUpdatedCount = sqlComm.ExecuteNonQuery();
//---END LOOP
sqlComm = new SQLiteCommand("end", dbcon);
sqlComm.ExecuteNonQuery();
dbcon.Close();
I read everywhere that creating transactions is the solution to slow SQLite writes, but it can be long and painful to rewrite your code and wrap all your SQLite writes in transactions.
I found a much simpler, safe and very efficient method: enabling the Write-Ahead Log (WAL), a SQLite 3.7.0 optimisation that is disabled by default.
The documentation says it works on all Unix (i.e. Linux and OS X) and Windows systems.
How? Just run the following commands after initializing your SQLite connection:
PRAGMA journal_mode = WAL
PRAGMA synchronous = NORMAL
My code now runs ~600% faster: my test suite now runs in 38 seconds instead of 4 minutes :)
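In C# with System.Data.SQLite, that could look like the following (a sketch; dbcon is assumed to be an already-opened SQLiteConnection):

using (var pragmaCmd = new SQLiteCommand("PRAGMA journal_mode = WAL; PRAGMA synchronous = NORMAL;", dbcon))
{
    pragmaCmd.ExecuteNonQuery(); // run once per connection, right after Open()
}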
Try wrapping all of your inserts (aka, a bulk insert) into a single transaction:
string insertString = "INSERT INTO [TableName] ([ColumnName]) VALUES (@value)";
SQLiteCommand command = new SQLiteCommand(insertString, dbConnection);
SQLiteTransaction transaction = dbConnection.BeginTransaction();
command.Transaction = transaction;
try
{
    //---INSIDE LOOP
    command.Parameters.Clear();
    command.Parameters.AddWithValue("@value", value); // the current row's value
    nRowUpdatedCount = command.ExecuteNonQuery();
    //---END LOOP
    transaction.Commit();
    return true;
}
catch (SQLiteException)
{
    transaction.Rollback();
}
By default, SQLite wraps every insert in its own transaction, which slows down the process:
INSERT is really slow - I can only do few dozen INSERTs per second
Actually, SQLite will easily do 50,000 or more INSERT statements per second on an average desktop computer. But it will only do a few dozen transactions per second.
Transaction speed is limited by disk drive speed because (by default) SQLite actually waits until the data really is safely stored on the disk surface before the transaction is complete. That way, if you suddenly lose power or if your OS crashes, your data is still safe. For details, read about atomic commit in SQLite.
By default, each INSERT statement is its own transaction. But if you surround multiple INSERT statements with BEGIN...COMMIT then all the inserts are grouped into a single transaction. The time needed to commit the transaction is amortized over all the enclosed insert statements and so the time per insert statement is greatly reduced.
See "Optimizing SQL Queries" in the ADO.NET help file SQLite.NET.chm. Code from that page:
using (SQLiteTransaction mytransaction = myconnection.BeginTransaction())
{
    using (SQLiteCommand mycommand = new SQLiteCommand(myconnection))
    {
        SQLiteParameter myparam = new SQLiteParameter();
        int n;
        mycommand.CommandText = "INSERT INTO [MyTable] ([MyId]) VALUES(?)";
        mycommand.Parameters.Add(myparam);
        for (n = 0; n < 100000; n++)
        {
            myparam.Value = n + 1;
            mycommand.ExecuteNonQuery();
        }
    }
    mytransaction.Commit();
}

Update always encrypted column from decrypted column

I would like to encrypt an existing database column with Always Encrypted. My project is an ASP.NET project using code first, and the database is SQL Server. The database already has data. I created a migration to achieve my goal.
First I tried to alter the column type, using the following:
ALTER TABLE [dbo].[TestDecrypted] ALTER COLUMN [FloatCol] [float] ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL
I got the following error.
Operand type clash: float is incompatible with float encrypted with (encryption_type = 'RANDOMIZED', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = 'CEK_Auto1', column_encryption_key_database_name = 'TestEncrypt')
Then I decided to create another column and migrate the data.
ALTER TABLE [dbo].[TestDecrypted] ADD [FloatCol2] [float] ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL
UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = [FloatCol]
And I got the same error.
After I looked at this, I noticed that it is possible to insert data like the following:
DECLARE @floatCol FLOAT = 1.1
UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @floatCol
But if I try to obtain the value from my existing column, it fails.
DECLARE @floatCol FLOAT = (SELECT TOP 1 FloatCol FROM TestDecrypted)
UPDATE [dbo].[TestDecrypted] SET FloatCol2 = @floatCol
The error follows.
Encryption scheme mismatch for columns/variables '@floatCol'. The encryption scheme for the columns/variables is (encryption_type = 'PLAINTEXT') and the expression near line '4' expects it to be (encryption_type = 'RANDOMIZED', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = 'CEK_Auto1', column_encryption_key_database_name = 'TestEncrypt').
Does anyone know how I can achieve my goal?
Update 1
@Nikhil-Vithlani-Microsoft made some interesting suggestions.
Always Encrypted Wizard in SSMS - I would like to achieve my goal with code first migrations, so this idea does not fit.
SqlBulkCopy - It does not work inside migrations, because the new column will only exist after the whole 'Up' method has run. Therefore we cannot insert data into this column this way inside that method.
Anyway, his suggestions drove me to another attempt: obtain the decrypted values and update the encrypted column with them.
var values = new Dictionary<Guid, double>();
var connectionString = ConfigurationManager.ConnectionStrings["MainDb"].ConnectionString;
using (var sourceConnection = new SqlConnection(connectionString))
{
    var myCommand = new SqlCommand("SELECT * FROM dbo.TestDecrypted", sourceConnection);
    sourceConnection.Open();
    using (var reader = myCommand.ExecuteReader())
    {
        while (reader.Read())
        {
            values.Add((Guid)reader["Id"], (double)reader["FloatCol"]);
        }
    }
}
Sql("ALTER TABLE [dbo].[TestDecrypted] ADD [FloatCol2] [float] ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL");
foreach (var valuePair in values)
{
    // The error occurs here
    Sql($@"DECLARE @value FLOAT = {valuePair.Value}
           UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @value WHERE Id = '{valuePair.Key}'");
}
In fact, I did not try to create another column and migrate the data as in the example mentioned above; I tried that only in SSMS.
And now I got a different error.
Transaction (Process ID 57) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
I tried to do it without encrypting the new column, and it worked properly.
Any idea why this error occurs?
You will have to do the Always Encrypted-related migration outside of Entity Framework. This blog should help:
https://blogs.msdn.microsoft.com/sqlsecurity/2015/08/27/using-always-encrypted-with-entity-framework-6/
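One detail worth noting (an assumption of this sketch, not spelled out in the answer): the client connection must opt in to Always Encrypted before parameters targeting encrypted columns can be encrypted transparently:

var csb = new SqlConnectionStringBuilder(connectionString)
{
    ColumnEncryptionSetting = SqlConnectionColumnEncryptionSetting.Enabled
};
using (var conn = new SqlConnection(csb.ConnectionString))
{
    conn.Open(); // the driver now encrypts/decrypts Always Encrypted columns transparently
}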
If you want to encrypt an existing column, you can use Always Encrypted Wizard in SSMS, or use this article that explains how to migrate existing data.
Also, please note that doing bulk inserts through a C# (.NET 4.6.1+ client) app is supported.
You can do this in C# using SqlBulkCopy, specifically the SqlBulkCopy.WriteToServer(IDataReader) method; see the sketch after the table definitions below.
Create a new table (encryptedTable) with the same schema as that of your plaintext table (unencryptedTable) but with the encryption turned on for the desired columns.
Do SELECT * FROM unencryptedTable to load the data into a SqlDataReader, then use SqlBulkCopy.WriteToServer(IDataReader) to load it into encryptedTable.
For example,
Plaintext Table
CREATE TABLE [dbo].[Patients](
[PatientId] [int] IDENTITY(1,1),
[SSN] [char](11) NOT NULL)
Encrypted Table
CREATE TABLE [dbo].[Patients](
[PatientId] [int] IDENTITY(1,1),
[SSN] [char](11) COLLATE Latin1_General_BIN2
ENCRYPTED WITH (ENCRYPTION_TYPE = DETERMINISTIC,
ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256',
COLUMN_ENCRYPTION_KEY = CEK1) NOT NULL)
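A minimal sketch of the copy step under those table definitions (the connection strings are placeholders, and the plaintext table is assumed to have been renamed to [dbo].[Patients_Plaintext] so both tables can coexist; note Column Encryption Setting=Enabled on the destination connection, which makes the driver encrypt the values in flight):

using (var src = new SqlConnection("Server=.;Database=TestEncrypt;Integrated Security=true"))
using (var dst = new SqlConnection("Server=.;Database=TestEncrypt;Integrated Security=true;Column Encryption Setting=Enabled"))
{
    src.Open();
    dst.Open();
    using (var readCmd = new SqlCommand("SELECT [PatientId], [SSN] FROM [dbo].[Patients_Plaintext]", src))
    using (SqlDataReader reader = readCmd.ExecuteReader())
    using (var bulk = new SqlBulkCopy(dst, SqlBulkCopyOptions.KeepIdentity, null))
    {
        bulk.DestinationTableName = "[dbo].[Patients]";
        bulk.WriteToServer(reader); // values are encrypted client-side as they stream in
    }
}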
As for why your method does not work: when you use parameterization for Always Encrypted, the right-hand side (RHS) of the DECLARE statement needs to be a literal, because the driver identifies the literal and encrypts it for you. So the following will not work, since the RHS is a SQL expression and cannot be encrypted by the driver:
DECLARE @floatCol FLOAT = (SELECT TOP 1 FloatCol FROM TestDecrypted)
UPDATE [dbo].[TestDecrypted] SET FloatCol2 = @floatCol
Update:
The following code will not work because parameterization for Always Encrypted only applies to SSMS:
foreach (var valuePair in values)
{
    // The error occurs here
    Sql($@"DECLARE @value FLOAT = {valuePair.Value}
           UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @value WHERE Id = '{valuePair.Key}'");
}
However, if you rewrite your code as follows, that should work
foreach (var valuePair in values)
{
    SqlCommand cmd = _sqlconn.CreateCommand();
    cmd.CommandText = $@"UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @FloatVar WHERE Id = '{valuePair.Key}'";
    SqlParameter paramFloat = cmd.CreateParameter();
    paramFloat.ParameterName = "@FloatVar";
    paramFloat.SqlDbType = SqlDbType.Float;
    paramFloat.Direction = ParameterDirection.Input;
    paramFloat.Value = valuePair.Value;
    cmd.Parameters.Add(paramFloat);
    cmd.ExecuteNonQuery();
}
Hope that helps; if you have additional questions, please leave them in the comments.

Firebird insert...returning asp.net

I'm using Firebird 2.5 and asp.net (4.5).
I'm trying to find out how to use insert ... returning, or some equivalent.
Using FbDataReader, it executes the insert OK, but I can't find any way of accessing a returned value. Using FbDataReader.GetName(0) seems to work OK, returning the variable name in the "returning" clause. This even applies to a max() in a subselect:
..... returning (select max(userid) as newid from users)
returns the text "newid".
I can't find where, or whether, the value is available.
Using an FbDataAdapter to fill a DataTable, the insert works OK, but the DataTable seems empty.
Does anyone know whether this is possible, and if so, how it's done?
Thanks
EDIT
Code supplied:
strConn = ....
dbConn = New FirebirdSql.Data.FirebirdClient.FbConnection(strConn)
dbConn.Open()
MySQL = "insert into users (Firstname, Lastname) VALUES (#fname,#lname) returning userid"
FbC = New FirebirdSql.Data.FirebirdClient.FbCommand(MySQL, dbConn)
FbC.Parameters.Add("fname", FirebirdSql.Data.FirebirdClient.FbDbType.Text).Value = "Pete"
FbC.Parameters.Add("lname", FirebirdSql.Data.FirebirdClient.FbDbType.Text).Value = "Davis"
FbDataReader = FbC.ExecuteReader()
FbDataReader.Read()
TextBox1.Text = FbDataReader.GetName(0)
'TextBox1.Text = str(FbDataReader.GetInt64())
'TextBox1.Text = FbDataReader.GetString(0)
TextBox1.Text = FbDataReader.GetValue(0)
According to this thread, INSERT ... RETURNING behaves like output parameters for the Firebird .NET provider, so you will need to add an output parameter.
So something like the code below should work:
FbParameter outParam = new FbParameter("userid", FbDbType.Integer)
{
Direction = ParameterDirection.Output
};
FbC.Parameters.Add(outParam);
FbC.ExecuteNonQuery();
int? userId = outParam.Value as int?;

SqlCommand slow when executing a query with parameters

DbDataAdapter.Fill() is extremely slow when using parameters!
I have a query with 2 parameters inside, and when I hardcode those parameters in the query it takes 1 second to execute (against a table of 470k rows, returning only 20 rows).
I found many similar posts here and I tried all those solutions (SET ARITHABORT, OPTION (RECOMPILE), OPTION (OPTIMIZE FOR ...), ...) with no luck.
I just run a plain query (SQL Server 2008), not a stored procedure, so the query with ARITHABORT looks like this:
string strSql = @"set ARITHABORT ON;
select TOP 20 ....
I also tried to run SET ARITHABORT within the same transaction, executing that statement first.
I don't know if I'm doing something wrong, but my feeling is that ADO.NET is producing a very bad execution plan when I have parameters defined.
As a result of this bad choice, the execution time in SSMS is 1 second (after being cached) but in ASP.NET it is more like 9 seconds!
The query is something like this:
strSQL = @"
select *
from Table1
where Name like @name";
And then:
DbProviderFactory factory = DbProviderFactories.GetFactory(mProvider);
DbCommand dbcmd = factory.CreateCommand();
if (CommandTimeout != null)
dbcmd.CommandTimeout = CommandTimeout.Value;
if(this.transaccion != null)
dbcmd.Transaction = this.transaccion;
dbcmd.Connection = dbc;
dbcmd.CommandText = strSQL;
if (parametros != null)
dbcmd.Parameters.AddRange(parametros);
DbDataAdapter dbda = factory.CreateDataAdapter();
dbda.SelectCommand = dbcmd;
DataTable dt = new DataTable();
dbda.Fill(dt);
return dt;
EDIT 14/01/2013 (18:44)
I no longer retrieve the connection from DbProviderFactory; instead I'm using SqlConnection and SqlCommand directly. I know DbCommand and DbConnection are base classes... but I think there is something more going on in there, because the performance drastically increased, by something like 300%!
It's not the Fill method, because I had already tried that in the code shown before.
Anyway, I don't know the reason why, but using SqlConnection is much faster! Any idea? Maybe it isn't making that bad execution plan from before?
SqlCommand objCmd = new SqlCommand(strSQL, sqlConn);
if (CommandTimeout != null)
objCmd.CommandTimeout = CommandTimeout.Value;
if (this.transaccion != null)
objCmd.Transaction = SQLtransaccion;
if (parametros != null)
objCmd.Parameters.AddRange(parametros);
DbDataReader dbReader = objCmd.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dbReader);
dbReader.Close();
return dt;
Any help will be greatly appreciated,
Thanks,
I found the solution!
It was the parameters!
I was using the wrong type in the parameter list!
Parametross.Add(bd.MakeParameter("@val", "%" + txtFind.Text + "%",
DbType.String));
DbType.String vs. DbType.AnsiString
Although both DbType.String and DbType.AnsiString deal with character data, these data types are processed differently, and using the wrong data type can have a negative effect on the application's performance. DbType.String identifies the parameter as a 2-byte Unicode value and is sent to the server as such. DbType.AnsiString causes the parameter to be sent as a multibyte character string. To avoid excessive string conversions, use:
DbType.AnsiString for char or varchar columns and parameters.
DbType.String for unichar and univarchar columns and parameters.
Source:
http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.infocenter.dc20066.0115/html/adonet/adonet49.htm
In my query there is a:
....
where Table.Col1 like @val
But the column type was varchar, so I should use DbType.AnsiString instead of DbType.String:
Parametross.Add(bd.MakeParameter("@val", "%" + txtFind.Text + "%",
DbType.AnsiString));
On my huge table this was causing a lot of unnecessary casts, and that is the reason why the performance drastically fell!
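For illustration, the same contrast expressed with explicit SqlParameter types (a sketch; the parameter sizes are assumptions, txtFind comes from the question): a Unicode parameter compared against a varchar column forces a per-row implicit conversion, while an ANSI parameter matches the column type and can use its index directly.

// Slow: DbType.String maps to nvarchar, so SQL Server must convert
// the varchar column Table.Col1 to nvarchar for every row scanned.
var slow = new SqlParameter("@val", SqlDbType.NVarChar, 4000) { Value = "%" + txtFind.Text + "%" };

// Fast: DbType.AnsiString maps to varchar, matching the column's type,
// so no row-by-row conversion is needed.
var fast = new SqlParameter("@val", SqlDbType.VarChar, 8000) { Value = "%" + txtFind.Text + "%" };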
Hope this will help someone,
