Is there any equivalent of SqlBulkCopy in Teradata - teradata

After some googling, I could not find a proper replacement for SqlBulkCopy from SqlClient for Teradata. Can anybody suggest something like SqlBulkCopy for Teradata that can be used from C#? I need to insert up to a few million rows into TD.
I need this to compare a set of rows retrieved from an external DB and dumped into TD against the data already available in Teradata.
Any suggestion is appreciated.

I couldn't find an equivalent, but this is acceptably fast for my purposes. I also suspect that UpdateBatchSize could be tweaked to match your particular data to increase speed.
As written, your source and destination tables must have the same columns (like BulkCopy, although not necessarily in the same order).
TdConnection tdCon = new TdConnection(tdConString);
SqlConnection sqlCon1 = new SqlConnection(serverOneConString);

// Get the schema for the destination table (WHERE 0 = 1 returns no rows)
var query = "SELECT * FROM [Destination_Table] WHERE 0 = 1";
using (TdDataAdapter insertAdapter = new TdDataAdapter(query, tdCon))
{
    DataSet ds = new DataSet();
    insertAdapter.Fill(ds);

    // Load data from the source table
    DataTable dt = new DataTable();
    using (SqlDataAdapter dataAdapter = new SqlDataAdapter("SELECT * FROM [Source_Table]", sqlCon1))
    {
        dataAdapter.SelectCommand.CommandTimeout = 240;
        dataAdapter.Fill(dt);
    }

    // Move data from source to destination, matching column names
    foreach (DataRow row in dt.Rows)
    {
        var newRow = ds.Tables[0].NewRow();
        foreach (DataColumn column in dt.Columns)
        {
            newRow[column.ColumnName] = row[column.ColumnName];
        }
        ds.Tables[0].Rows.Add(newRow);
    }

    // The command builder generates the INSERT statement from the destination schema
    TdCommandBuilder builder = new TdCommandBuilder(insertAdapter);
    insertAdapter.UpdateBatchSize = 250;
    insertAdapter.Update(ds);
}

Teradata's .NET provider can be used for loading; you need to set TdDataAdapter.UpdateBatchSize as high as possible, at least a few hundred.
If this is not fast enough for larger amounts of data, you might switch to Teradata's TPT API.

Related

DBGrid, Bookmarks and SQLite very slow [duplicate]

I recently read about SQLite and thought I would give it a try. When I insert one record it performs okay. But when I insert one hundred it takes five seconds, and as the record count increases so does the time. What could be wrong? I am using the SQLite wrapper (System.Data.SQLite):
dbcon = new SQLiteConnection(connectionString);
dbcon.Open();

//---INSIDE LOOP
SQLiteCommand sqlComm = new SQLiteCommand(sqlQuery, dbcon);
nRowUpdatedCount = sqlComm.ExecuteNonQuery();
//---END LOOP

dbcon.Close();
Wrap BEGIN / END statements around your bulk inserts. SQLite is optimized for transactions.
dbcon = new SQLiteConnection(connectionString);
dbcon.Open();

SQLiteCommand sqlComm;
sqlComm = new SQLiteCommand("begin", dbcon);
sqlComm.ExecuteNonQuery();

//---INSIDE LOOP
sqlComm = new SQLiteCommand(sqlQuery, dbcon);
nRowUpdatedCount = sqlComm.ExecuteNonQuery();
//---END LOOP

sqlComm = new SQLiteCommand("end", dbcon);
sqlComm.ExecuteNonQuery();

dbcon.Close();
I read everywhere that creating transactions is the solution to slow SQLite writes, but it can be long and painful to rewrite your code and wrap all your SQLite writes in transactions.
I found a much simpler, safe and very efficient method: I enable a (disabled by default) SQLite 3.7.0 optimisation: the Write-Ahead Log (WAL).
The documentation says it works on all Unix (i.e. Linux and OS X) and Windows systems.
How? Just run the following commands after initializing your SQLite connection:
PRAGMA journal_mode = WAL
PRAGMA synchronous = NORMAL
My code now runs ~600% faster: my test suite now runs in 38 seconds instead of 4 minutes :)
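In C# with System.Data.SQLite, issuing those pragmas might look like the following minimal sketch (the connection string is an assumption; both statements can be sent in one command):
using (var dbcon = new SQLiteConnection(connectionString))
{
    dbcon.Open();
    // Enable WAL and relax synchronous fsyncs for this database connection
    using (var pragma = new SQLiteCommand(
        "PRAGMA journal_mode = WAL; PRAGMA synchronous = NORMAL;", dbcon))
    {
        pragma.ExecuteNonQuery();
    }
    // ...subsequent INSERTs benefit from WAL without explicit transactions
}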
Try wrapping all of your inserts (aka, a bulk insert) into a single transaction:
string insertString = "INSERT INTO [TableName] ([ColumnName]) VALUES (@value)";
SQLiteCommand command = new SQLiteCommand();
command.Parameters.AddWithValue("@value", value);
command.CommandText = insertString;
command.Connection = dbConnection;

SQLiteTransaction transaction = dbConnection.BeginTransaction();
try
{
    //---INSIDE LOOP
    SQLiteCommand sqlComm = new SQLiteCommand(sqlQuery, dbConnection);
    nRowUpdatedCount = sqlComm.ExecuteNonQuery();
    //---END LOOP
    transaction.Commit();
    return true;
}
catch (SQLiteException ex)
{
    transaction.Rollback();
}
By default, SQLite wraps every insert in a transaction, which slows down the process:
INSERT is really slow - I can only do few dozen INSERTs per second
Actually, SQLite will easily do 50,000 or more INSERT statements per second on an average desktop computer. But it will only do a few dozen transactions per second.
Transaction speed is limited by disk drive speed because (by default) SQLite actually waits until the data really is safely stored on the disk surface before the transaction is complete. That way, if you suddenly lose power or if your OS crashes, your data is still safe. For details, read about atomic commit in SQLite.
By default, each INSERT statement is its own transaction. But if you surround multiple INSERT statements with BEGIN...COMMIT then all the inserts are grouped into a single transaction. The time needed to commit the transaction is amortized over all the enclosed insert statements, so the time per insert statement is greatly reduced.
See "Optimizing SQL Queries" in the ADO.NET help file SQLite.NET.chm. Code from that page:
using (SQLiteTransaction mytransaction = myconnection.BeginTransaction())
{
    using (SQLiteCommand mycommand = new SQLiteCommand(myconnection))
    {
        SQLiteParameter myparam = new SQLiteParameter();
        int n;
        mycommand.CommandText = "INSERT INTO [MyTable] ([MyId]) VALUES(?)";
        mycommand.Parameters.Add(myparam);
        for (n = 0; n < 100000; n++)
        {
            myparam.Value = n + 1;
            mycommand.ExecuteNonQuery();
        }
    }
    mytransaction.Commit();
}

Update Always Encrypted column from decrypted column

I would like to encrypt an existing database column with Always Encrypted. My project is an ASP.NET project using Code First, and the database is SQL Server. The database already has data. I created a migration to achieve my goal.
First I tried to alter the column type, using the following.
ALTER TABLE [dbo].[TestDecrypted] ALTER COLUMN [FloatCol] [float] ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL
I got the following error.
Operand type clash: float is incompatible with float encrypted with (encryption_type = 'RANDOMIZED', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = 'CEK_Auto1', column_encryption_key_database_name = 'TestEncrypt')
Then I decided to create another column and migrate the data.
ALTER TABLE [dbo].[TestDecrypted] ADD [FloatCol2] [float] ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL
UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = [FloatCol]
And I got the same error.
After I looked at this, I noticed that it is possible to insert data like the following:
DECLARE @floatCol FLOAT = 1.1
UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @floatCol
But if I try to obtain the value from my existing column, it fails.
DECLARE @floatCol FLOAT = (SELECT TOP 1 FloatCol FROM TestDecrypted)
UPDATE [dbo].[TestDecrypted] SET FloatCol2 = @floatCol
The error follows.
Encryption scheme mismatch for columns/variables '@floatCol'. The encryption scheme for the columns/variables is (encryption_type = 'PLAINTEXT') and the expression near line '4' expects it to be (encryption_type = 'RANDOMIZED', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = 'CEK_Auto1', column_encryption_key_database_name = 'TestEncrypt').
Does anyone know how I can achieve my goal?
Update 1
@Nikhil-Vithlani-Microsoft made some interesting suggestions.
Always Encrypted Wizard in SSMS - I would like to achieve my goal with Code First migrations, so this idea does not fit.
SqlBulkCopy - It does not work inside migrations, because the new column will only exist after the whole 'Up' method has run. Therefore we cannot insert data into this column this way inside that method.
Anyway, his suggestions led me to another attempt: obtain the decrypted values and update the encrypted column with them.
var values = new Dictionary<Guid, double>();
var connectionString = ConfigurationManager.ConnectionStrings["MainDb"].ConnectionString;
using (var sourceConnection = new SqlConnection(connectionString))
{
    var myCommand = new SqlCommand("SELECT * FROM dbo.TestDecrypted", sourceConnection);
    sourceConnection.Open();
    using (var reader = myCommand.ExecuteReader())
    {
        while (reader.Read())
        {
            values.Add((Guid)reader["Id"], (double)reader["FloatCol"]);
        }
    }
}

Sql("ALTER TABLE [dbo].[TestDecrypted] ADD [FloatCol2] [float] ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL");

foreach (var valuePair in values)
{
    // The error occurs here
    Sql($@"DECLARE @value FLOAT = {valuePair.Value}
           UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @value WHERE Id = '{valuePair.Key}'");
}
In fact, I did not try to create another column and migrate the data, as mentioned in the example above. I tried that only in SSMS.
And now I got a different error.
Transaction (Process ID 57) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction.
I tried to do it without encrypting the new column, and it worked properly.
Any idea why this error occurs?
You will have to do the Always Encrypted-related migration outside of Entity Framework. This blog should help:
https://blogs.msdn.microsoft.com/sqlsecurity/2015/08/27/using-always-encrypted-with-entity-framework-6/
If you want to encrypt an existing column, you can use the Always Encrypted Wizard in SSMS, or use this article that explains how to migrate existing data.
Also, please note that doing bulk inserts through a C# (.NET 4.6.1+ client) app is supported.
You can do this in C# using SqlBulkCopy, specifically the SqlBulkCopy.WriteToServer(IDataReader) method:
Create a new table (encryptedTable) with the same schema as that of your plaintext table (unencryptedTable) but with encryption turned on for the desired columns.
Do SELECT * FROM unencryptedTable to load the data into a SqlDataReader, then use SqlBulkCopy to load it into encryptedTable using the SqlBulkCopy.WriteToServer(IDataReader) method. A sketch of this follows the table definitions below.
For example,
Plaintext Table
CREATE TABLE [dbo].[Patients](
    [PatientId] [int] IDENTITY(1,1),
    [SSN] [char](11) NOT NULL)
Encrypted Table
CREATE TABLE [dbo].[Patients](
    [PatientId] [int] IDENTITY(1,1),
    [SSN] [char](11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (ENCRYPTION_TYPE = DETERMINISTIC,
        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256',
        COLUMN_ENCRYPTION_KEY = CEK1) NOT NULL)
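A minimal C# sketch of that bulk copy, assuming the two [dbo].[Patients] tables above live in databases reachable via plainConnectionString and encryptedConnectionString (both connection string names are assumptions). The destination connection string must include Column Encryption Setting=enabled so the driver encrypts the values as they are written:
using (var source = new SqlConnection(plainConnectionString))
using (var destination = new SqlConnection(encryptedConnectionString))
{
    source.Open();
    destination.Open();

    var select = new SqlCommand("SELECT [PatientId], [SSN] FROM [dbo].[Patients]", source);
    using (SqlDataReader reader = select.ExecuteReader())
    using (var bulkCopy = new SqlBulkCopy(destination))
    {
        bulkCopy.DestinationTableName = "[dbo].[Patients]";
        // Plaintext SSN values are encrypted client-side during the copy
        bulkCopy.WriteToServer(reader);
    }
}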
As for why your method does not work: when you use parameterization for Always Encrypted, the right-hand side (RHS) of the DECLARE statement needs to be a literal, because the driver identifies the literal and encrypts it for you. So the following will not work, since the RHS is a SQL expression and cannot be encrypted by the driver:
DECLARE @floatCol FLOAT = (SELECT TOP 1 FloatCol FROM TestDecrypted)
UPDATE [dbo].[TestDecrypted] SET FloatCol2 = @floatCol
Update:
The following code will not work, because parameterization for Always Encrypted only applies to SSMS:
foreach (var valuePair in values)
{
    // The error occurs here
    Sql($@"DECLARE @value FLOAT = {valuePair.Value}
           UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @value WHERE Id = '{valuePair.Key}'");
}
However, if you rewrite your code as follows, it should work:
foreach (var valuePair in values)
{
    SqlCommand cmd = _sqlconn.CreateCommand();
    cmd.CommandText = $@"UPDATE [dbo].[TestDecrypted] SET [FloatCol2] = @FloatVar WHERE Id = '{valuePair.Key}'";
    SqlParameter paramFloat = cmd.CreateParameter();
    paramFloat.ParameterName = "@FloatVar";
    paramFloat.SqlDbType = SqlDbType.Float;
    paramFloat.Direction = ParameterDirection.Input;
    paramFloat.Value = valuePair.Value;
    cmd.Parameters.Add(paramFloat);
    cmd.ExecuteNonQuery();
}
Hope that helps; if you have additional questions, please leave them in the comments.

Retrieve all rows from DataTable in asp.net

I have created a temporary DataTable to store data before sending it to the database. Now I want to fetch all the rows from this temporary DataTable and save them to the database. To do this I loop over my DataTable using a foreach loop:
foreach (DataRow r in dt.Rows)
{
    string Fname = dt.Rows[0]["Name"].ToString();
    string cType = dt.Rows[0]["ContentType"].ToString();
    byte[] ePic = (byte[])dt.Rows[0]["pic"];
    BAL.saveEventPictures(Convert.ToInt32(lblEventID.Text), Fname, cType, ePic);
}
The problem is that it fetches the data of the first row again and again for the whole loop count. If I have 4 DataRows with different information, it stores the data of the 1st row in the database 4 times. What mistake am I making?
dt.Rows[0] means you are always accessing the first row of your DataTable. You need to use a for loop instead of a foreach and use i to access each index of your table:
for (int i = 0; i < dt.Rows.Count; i++)
{
    string Fname = dt.Rows[i]["Name"].ToString();
    string cType = dt.Rows[i]["ContentType"].ToString();
    byte[] ePic = (byte[])dt.Rows[i]["pic"];
    BAL.saveEventPictures(Convert.ToInt32(lblEventID.Text), Fname, cType, ePic);
}
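Alternatively (a sketch not in the original answers), the existing foreach would also work if it indexed the loop variable r itself instead of dt.Rows[0]:
foreach (DataRow r in dt.Rows)
{
    string Fname = r["Name"].ToString();
    string cType = r["ContentType"].ToString();
    byte[] ePic = (byte[])r["pic"];
    BAL.saveEventPictures(Convert.ToInt32(lblEventID.Text), Fname, cType, ePic);
}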
You can use a for loop to fetch each row's column values and assign them to your variables. If you use a hard-coded row index, you fetch the same single row every time.
Here i is the row index and the column name selects the value from that row, so indexing the rows dynamically retrieves all records:
for (int i = 0; i < dt.Rows.Count; i++)
{
    string Fname = dt.Rows[i]["Name"].ToString();
    string cType = dt.Rows[i]["ContentType"].ToString();
    byte[] ePic = (byte[])dt.Rows[i]["pic"];
    BAL.saveEventPictures(Convert.ToInt32(lblEventID.Text), Fname, cType, ePic);
}

SqlCommand slow when executing a query with parameters

DbDataAdapter.Fill() is extremely slow when using parameters!
I have a query with 2 parameters inside, and when I hardcode those parameter values in the query it takes 1 second to execute (against a 470k-row table, returning only 20 rows).
I found many similar posts here and I tried all those solutions (set arithabort, option recompile, option optimize for, ...) with no luck.
I just perform a query (SQL Server 2008), not a stored procedure, so the query with arithabort is like this:
string strSql = @"set ARITHABORT ON;
    select TOP 20 ....
I also tried to set arithabort in the same transaction, executing that statement first.
I don't know if I'm doing something wrong, but my impression is that ADO.NET is producing a very bad execution plan when I have parameters defined.
As a result of this bad choice, the execution time in SSMS is 1 second (after being cached) but in ASP it is more like 9 seconds!
The query is something like this:
strSQL = @"
    select *
    from Table1
    where Name like @name";
And then:
DbProviderFactory factory = DbProviderFactories.GetFactory(mProvider);
DbCommand dbcmd = factory.CreateCommand();
if (CommandTimeout != null)
    dbcmd.CommandTimeout = CommandTimeout.Value;
if (this.transaccion != null)
    dbcmd.Transaction = this.transaccion;
dbcmd.Connection = dbc;
dbcmd.CommandText = strSQL;
if (parametros != null)
    dbcmd.Parameters.AddRange(parametros);
DbDataAdapter dbda = factory.CreateDataAdapter();
dbda.SelectCommand = dbcmd;
DataTable dt = new DataTable();
dbda.Fill(dt);
return dt;
EDIT 14/01/2013 (18:44)
I'm no longer retrieving the connection from DbProviderFactory; instead I'm using SqlConnection and SqlCommand directly. I know DbCommand and DbProviderFactory are just base classes... but there must be something more to it, because performance drastically increased, by something like 300%!
It's not the Fill method, because I had already tried that in the code shown before.
Anyway, I don't know why, but using a SqlConnection is much faster! Any idea? Maybe it isn't producing the bad execution plan made before?
SqlCommand objCmd = new SqlCommand(strSQL, sqlConn);
if (CommandTimeout != null)
    objCmd.CommandTimeout = CommandTimeout.Value;
if (this.transaccion != null)
    objCmd.Transaction = SQLtransaccion;
if (parametros != null)
    objCmd.Parameters.AddRange(parametros);
DbDataReader dbReader = objCmd.ExecuteReader();
DataTable dt = new DataTable();
dt.Load(dbReader);
dbReader.Close();
return dt;
Any help will be greatly appreciated,
Thanks,
I found the solution!
It was the parameters!
I was using the wrong type in the parameter list!
Parametross.Add(bd.MakeParameter("@val", "%" + txtFind.Text + "%",
    DbType.String));
DbType.String vs. DbType.AnsiString
Although both DbType.String and DbType.AnsiString deal with character data, these data types are processed differently, and using the wrong data type can have a negative effect on the application's performance. DbType.String identifies the parameter as a 2-byte Unicode value and is sent to the server as such. DbType.AnsiString causes the parameter to be sent as a multibyte character string. To avoid excessive string conversions, use:
DbType.AnsiString for char or varchar columns and parameters.
DbType.String for unichar and univarchar columns and parameters.
Source:
http://infocenter.sybase.com/help/index.jsp?topic=/com.sybase.infocenter.dc20066.0115/html/adonet/adonet49.htm
In my query there is a:
....
where Table.Col1 like @val
But the column type was varchar, so I should use DbType.AnsiString instead of DbType.String:
Parametross.Add(bd.MakeParameter("@val", "%" + txtFind.Text + "%",
    DbType.AnsiString));
In my huge table I was causing a lot of unnecessary conversions, and this is the reason why performance drastically fell!
Hope this will help someone,
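For reference, with plain SqlClient the same fix is to type the parameter explicitly as VarChar, which maps to DbType.AnsiString. A sketch, where the table, column, and parameter length are assumptions taken from the query above:
// SqlDbType.VarChar sends an ANSI string, so the server can search the varchar
// column directly instead of converting every row to Unicode first.
var cmd = new SqlCommand("select TOP 20 * from Table1 where Name like @val", sqlConn);
cmd.Parameters.Add("@val", SqlDbType.VarChar, 100).Value = "%" + txtFind.Text + "%";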

Upload image to server using C#/.NET and storing filename in DB

I'm currently using the following snippet to insert data into a table in my database. It works great. But I want to start adding filename data and I'm not sure how to proceed.
I have the following:
// Create command
comm = new SqlCommand(
    "INSERT INTO Entries (Title, Description) " +
    "VALUES (@Title, @Description)", conn);

// Add command parameters
comm.Parameters.Add("@Description", System.Data.SqlDbType.Text);
comm.Parameters["@Description"].Value = descriptionTextBox.Text;
comm.Parameters.Add("@Title", System.Data.SqlDbType.NVarChar, 50);
comm.Parameters["@Title"].Value = titleTextBox.Text;
I also have a file upload option, but I don't know how to use it to do the following:
move the file to my images directory and
store the filename value in my table.
I have added the correct enctype to my form but am now a little lost.
Can someone explain the best way to do this?
Many thanks for any help with this.
To store the file in an images folder, it should be:
FileUpload1.SaveAs(Server.MapPath("~/Images/" + FileUpload1.FileName));
and then set the filename as a command parameter:
comm.Parameters["@FileName"].Value = FileUpload1.FileName;
Note: you must have the FileName field in your DB table.
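Putting that together with the original command, the complete INSERT might look like this sketch (the @FileName parameter and its nvarchar(260) size are assumptions; match them to your column):
// Save the uploaded image, then record its filename alongside the entry
FileUpload1.SaveAs(Server.MapPath("~/Images/" + FileUpload1.FileName));

comm = new SqlCommand(
    "INSERT INTO Entries (Title, Description, FileName) " +
    "VALUES (@Title, @Description, @FileName)", conn);
comm.Parameters.Add("@Description", System.Data.SqlDbType.Text).Value = descriptionTextBox.Text;
comm.Parameters.Add("@Title", System.Data.SqlDbType.NVarChar, 50).Value = titleTextBox.Text;
comm.Parameters.Add("@FileName", System.Data.SqlDbType.NVarChar, 260).Value = FileUpload1.FileName;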
I suggest storing the file in the DB too. This will guarantee data consistency.
Add a column to the DB. Replace X with a suitable size if the image is less than 8000 bytes, or specify varbinary(MAX) if it is not.
alter table Entries
add FileContent varbinary(X) not null
C# code:
byte[] fileContent = yourFileContent;
using (var connection = new SqlConnection(connectionString))
using (var command = connection.CreateCommand())
{
    command.CommandText = @"
        INSERT INTO Entries (Title, Description, FileContent)
        VALUES (@Title, @Description, @FileContent)
    ";
    command.Parameters.AddWithValue("@Description", descriptionTextBox.Text);
    command.Parameters.AddWithValue("@Title", titleTextBox.Text);
    command.Parameters.AddWithValue("@FileContent", fileContent);
    connection.Open();
    command.ExecuteNonQuery();
}
