I have created a temporary DataTable to store data before sending it to the database. Now I want to fetch all the rows from this temporary DataTable and save them to the database. For this I loop over my DataTable using a foreach loop:
foreach(DataRow r in dt.Rows)
{
string Fname = dt.Rows[0]["Name"].ToString();
string cType = dt.Rows[0]["ContentType"].ToString();
byte[] ePic = (byte[])dt.Rows[0]["pic"];
BAL.saveEventPictures(Convert.ToInt32(lblEventID.Text), Fname, cType, ePic);
}
The problem is that it only fetches the data of the first row again and again for the whole loop count. For example, if I have 4 data rows with different information, this stores the data of the 1st row in the database 4 times. What mistake am I making?
dt.Rows[0] always accesses the first row of your DataTable. Either read from the foreach loop variable r directly, or use a for loop and index each row with i:
for (int i = 0; i < dt.Rows.Count; i++)
{
string Fname = dt.Rows[i]["Name"].ToString();
string cType = dt.Rows[i]["ContentType"].ToString();
byte[] ePic = (byte[])dt.Rows[i]["pic"];
BAL.saveEventPictures(Convert.ToInt32(lblEventID.Text), Fname, cType, ePic);
}
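Note that the original foreach was already visiting every row; it just never used the loop variable. A minimal self-contained sketch of that variant, with the BAL call replaced by Console.WriteLine since that class isn't shown here:

```csharp
using System;
using System.Data;

class Program
{
    static void Main()
    {
        // Build a small stand-in for the temporary DataTable from the question.
        var dt = new DataTable();
        dt.Columns.Add("Name", typeof(string));
        dt.Columns.Add("ContentType", typeof(string));
        dt.Columns.Add("pic", typeof(byte[]));
        dt.Rows.Add("a.jpg", "image/jpeg", new byte[] { 1 });
        dt.Rows.Add("b.png", "image/png", new byte[] { 2 });

        // Read from the loop variable r, not dt.Rows[0].
        foreach (DataRow r in dt.Rows)
        {
            string fname = r["Name"].ToString();
            string cType = r["ContentType"].ToString();
            byte[] ePic = (byte[])r["pic"];
            Console.WriteLine($"{fname} {cType} {ePic.Length}");
        }
    }
}
```

Each iteration now reads a different row, so each call would receive that row's values.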
You can use a for loop to read each row's column values into your variables. If you hard-code the row index, you fetch the same single row every time. Here i is the row index and the column name selects the value within that row, so index dynamically to get all records:
for (int i = 0; i < dt.Rows.Count; i++)
{
string Fname = dt.Rows[i]["Name"].ToString();
string cType = dt.Rows[i]["ContentType"].ToString();
byte[] ePic = (byte[])dt.Rows[i]["pic"];
BAL.saveEventPictures(Convert.ToInt32(lblEventID.Text), Fname, cType, ePic);
}
After some googling, I could not find a proper replacement for SqlBulkCopy (from SqlClient) in Teradata. Can anybody suggest something like SqlBulkCopy for Teradata to be used in C#? I need to insert up to a few million rows into TD.
I need this to compare a set of rows retrieved from an external DB and dumped into TD against data already available in Teradata.
Any suggestion is appreciated.
I couldn't find an equivalent, but this is acceptably fast for my purposes. I also suspect that UpdateBatchSize could be tuned to your particular data to increase speed.
As written, your source and destination tables must have the same columns (like BulkCopy, although not necessarily in the same order).
TdConnection tdCon = new TdConnection(tdConString);
SqlConnection sqlCon1 = new SqlConnection(serverOneConString);
DataTable dt = new DataTable(); // holds the rows read from the source table
// Get schema for destination table
var query = "SELECT * FROM [Destination_Table] where 0 = 1";
using (TdDataAdapter insertAdapter = new TdDataAdapter(query, tdCon))
{
DataSet ds = new DataSet();
insertAdapter.Fill(ds);
// Load data from source table
using (SqlDataAdapter dataAdapter = new SqlDataAdapter("SELECT * FROM [Source_Table]", sqlCon1)) {
dataAdapter.SelectCommand.CommandTimeout = 240;
dataAdapter.Fill(dt);
}
// Move data from source to destination, matching column names
foreach (DataRow row in dt.Rows) {
var newRow = ds.Tables[0].NewRow();
foreach (DataColumn column in dt.Columns) {
newRow[column.ColumnName] = row[column.ColumnName];
}
ds.Tables[0].Rows.Add(newRow);
}
TdCommandBuilder builder = new TdCommandBuilder(insertAdapter);
insertAdapter.UpdateBatchSize = 250;
insertAdapter.Update(ds);
}
Teradata's .NET provider can be used for loading; you need to set TdDataAdapter.UpdateBatchSize as high as possible, at least a few hundred.
If this is not fast enough for larger amounts of data, you might switch to Teradata's TPT API.
I am working on the .NET platform. I have created a Word template (.dotx) and added merge fields to it. I am replacing each merge field with database column values. Now I want to loop over the merge fields.
For example: I have merge fields named <DepartmentID> and <DepartmentName>, and I am getting data from a database table named "tblDepartment" which has two columns, "DepID" and "DepName". This table has 5 rows. I want to loop over these merge fields.
I want the file to look like this.
Before the code executes, the .docx file shows:
< DepartmentID > < DepartmentName >
After the code executes:
DepID DepName
1 DotNet
2 Java
3 PHP
Here is my code:
foreach (Word.Field myMergeField in wordDoc.Fields)
{
Word.Range rngFieldCode = myMergeField.Code;
String fieldText = rngFieldCode.Text;
// ONLY GETTING THE MAILMERGE FIELDS
if (fieldText.StartsWith(" MERGEFIELD"))
{
Int32 endMerge = fieldText.IndexOf("\\");
Int32 fieldNameLength = fieldText.Length - endMerge;
String fieldName = fieldText.Substring(11, endMerge - 11);
fieldName = fieldName.Trim().Replace("\"", "");
if (fieldName == "DepartmentID")
{
myMergeField.Select();
wordApp.Selection.TypeText("DepartmentID");
}
if (fieldName == "DepartmentName")
{
EAccreditationEntities dbModel = new EAccreditationEntities();
var q = (from p in dbModel.Departments select p).ToList();
//here q has all data from database , have columns (ID and Name of Department)
myMergeField.Select();
wordApp.Selection.TypeText("DEPNAME");
}
    }
}
And the problem is that , my document have five pages , merge fields on 5th page and sql query returns 5 rows. when i am executing my code ,it shows first record on 5th page, then 6,7,8,9 pages create with 1,2,3,4 pages content and on 10th page shows 2nd record and so on.... Whole process creates 25 pages document file.
But i want to show all data on 5th page only. and not want to create extra pages.
I have a UTF-8 string containing a \0 character, and a TEXT field in an SQLite table.
When I tried to insert the string into the TEXT field and then read it back from the database, I noticed that the string value was truncated after the \0 character.
Question: is it possible to save/restore such strings in SQLite without losing the data after the \0?
The code snippet:
public static void IssueWith0Character()
{
const string sql = "DROP TABLE IF EXISTS SomeTable;" +
"CREATE TABLE SomeTable (SomeField TEXT not null);"
+ "INSERT INTO SomeTable (SomeField) Values ( :value )";
var csb = new SQLiteConnectionStringBuilder
{DataSource = "stringWithNull.db", Version = 3};
// string with '0' character
const string stringWithNull = "beforeNull\0afterNull";
using (var c = new SQLiteConnection(csb.ConnectionString))
{
c.Open();
using (var cmd = c.CreateCommand())
{
var p = new SQLiteParameter(":value", DbType.String) {Value = stringWithNull};
cmd.CommandText = sql;
cmd.Parameters.Add(p);
cmd.ExecuteNonQuery();
}
using (var cmd = c.CreateCommand())
{
cmd.CommandText = "SELECT SomeField FROM SomeTable;";
var restoredValue = (string) cmd.ExecuteScalar();
Debug.Assert(stringWithNull == restoredValue);
}
}
}
UPDATE #1: It looks like the problem is in the reading stage. At least the "afterNull" part of the string exists in the database file.
UPDATE #2: This was confirmed as a System.Data.SQLite bug (< 1.0.84). http://system.data.sqlite.org/index.html/tktview/3567020edf12d438cb7cf757b774ff3a04dc381e
In SQLite, \0 characters are considered invalid.
While it is possible to put such strings into the database (using the pointer+length form of various functions), many functions that operate on strings stop when encountering the \0. Therefore, the documentation says:
The result of expressions involving strings with embedded NULs is undefined.
If you really need to store data with embedded null bytes, you should store it as a blob (DbType.Binary).
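A quick sketch of the idea: encode the string to bytes yourself and bind the byte[] with DbType.Binary, then decode on the way back; the embedded \0 survives the round trip. Only the encoding step is shown here (the SQLite calls are omitted so the snippet stays self-contained):

```csharp
using System;
using System.Diagnostics;
using System.Text;

class Program
{
    static void Main()
    {
        const string stringWithNull = "beforeNull\0afterNull";

        // Store this byte[] in a BLOB column (bind the parameter as
        // DbType.Binary) instead of passing the string as DbType.String.
        byte[] blob = Encoding.UTF8.GetBytes(stringWithNull);

        // Reading back: decode the BLOB; the \0 is preserved.
        string restored = Encoding.UTF8.GetString(blob);
        Debug.Assert(stringWithNull == restored);
        Console.WriteLine(restored.Length); // 20, including the \0
    }
}
```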
I am new to ASP.NET. I want to add a column to a GridView dynamically based on the response of an API:
id User secretcode
1 u1 {response from the API based on the Id value}
2 u1 {response from the API based on the Id value}
3 u1 {response from the API based on the Id value}
4 u1 {response from the API based on the Id value}
5 u1 {response from the API based on the Id value}
id and User are already in my database table (users), so for each returned row I want to call an API to fill my 3rd column, i.e. secretcode. Basically, I am confused about where to use the ForEach loop.
This is the rough code I am working on:
DataTable table = new DataTable();
DataColumn col3 = new DataColumn("Secretcode");
col3.DataType = System.Type.GetType("System.Int32");
table.Columns.Add(col3);
row[col3] = {response data from API}
gvTest.DataSource = table;
gvTest.DataBind();
Perhaps something like this:
// NOTE: 'table' must already be filled with the id/User rows from your
// database before the loop runs; a brand-new DataTable has no rows.
DataTable table = new DataTable();
DataColumn col = new DataColumn("Secretcode");
table.Columns.Add(col);
for(int i = 0; i < table.Rows.Count; i++)
{
// Where 'SomeAPICall()' is calling the API and returning the
// correct data type. If it is returning an object you may want
// to convert it before you attach it to the table
table.Rows[i]["Secretcode"] = SomeAPICall(table.Rows[i]["id"]);
}
gvTest.DataSource = table;
gvTest.DataBind();
Or if you're sold on the idea of the foreach loop:
// NOTE: 'table' must already be filled with the id/User rows from your
// database before the loop runs; a brand-new DataTable has no rows.
DataTable table = new DataTable();
DataColumn col = new DataColumn("Secretcode");
table.Columns.Add(col);
foreach(DataRow row in table.Rows)
{
// Where 'SomeAPICall()' is calling the API and returning the
// correct data type. If it is returning an object you may want
// to convert it before you attach it to the table
row["Secretcode"] = SomeAPICall(row["id"]);
}
gvTest.DataSource = table;
gvTest.DataBind();
I generally prefer for loops because you often want to use the same index on two different collections, which you can't easily do with a foreach loop. In this case, though, it won't matter.
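To make the shape of the loop concrete, here is a self-contained sketch of the foreach variant with the table pre-filled and SomeAPICall stubbed out. Both the sample rows and the stub are assumptions; substitute your real data load and API client:

```csharp
using System;
using System.Data;

class Program
{
    // Stand-in for the real API call; returns a fake code per id.
    static string SomeAPICall(object id) => "code-" + id;

    static void Main()
    {
        // The table must already contain the rows from the users table,
        // otherwise the loop below has nothing to iterate.
        var table = new DataTable();
        table.Columns.Add("id", typeof(int));
        table.Columns.Add("User", typeof(string));
        table.Rows.Add(1, "u1");
        table.Rows.Add(2, "u1");

        // Add the dynamic column, then fill it row by row.
        table.Columns.Add(new DataColumn("Secretcode", typeof(string)));
        foreach (DataRow row in table.Rows)
        {
            row["Secretcode"] = SomeAPICall(row["id"]);
        }

        foreach (DataRow row in table.Rows)
        {
            Console.WriteLine($"{row["id"]} {row["User"]} {row["Secretcode"]}");
        }
    }
}
```

Binding this table to the GridView (gvTest.DataSource = table; gvTest.DataBind();) then shows all three columns.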
I am trying to format XML data to display in a grid.
Page1.aspx inserts XML data stored in an xml data type column:
WorkHistory workhis = js.Deserialize<WorkHistory>(json);
XmlDocument work = (XmlDocument)JsonConvert.DeserializeXmlNode(json, "root");
objBLL.insert_XMLWork(work, Convert.ToInt64(ui.id));
Page2.aspx retrieves it and display on a grid:
DataTable FBWorkDt = objBLL.get_FacebookWork(FacebookUserId);
GrdWork.DataSource = FBWorkDt;
GrdWorkPub.DataBind();
get_FacebookWork (SELECT workinfo FROM Fprofiles WHERE Userid = FacebookUserId) returns a DataTable.
It displays in this format exactly.
WorkInfo
<root><work><employer><id>208571635850052</id><name>Netizen Apps</name></employer></work><id>1076483621</id></root>
How do I make a normal display instead of the raw XML?
Thanks,
Sun
It depends a good deal on the shape of the DataTable you're returning, but assuming you want the display to be something like this:
ID                   Name
-------------------- ---------------------
208571635850052      Netizen Apps
You could use LINQ:
DataTable FBWorkDt = objBLL.get_FacebookWork(FacebookUserId);
var query = from x in FBWorkDt.AsEnumerable()
            select new {
                id = x.Field<string>("id"),
                name = x.Field<string>("name")
            };
GrdWork.DataSource = query.ToList();
GrdWorkPub.DataBind();
I haven't tried the code out, so there may be minor syntactic changes, but essentially what it's doing is:
Use LINQ to get a collection of a new anonymous type that has one entry per row, with the id and name from the table. You have to use AsEnumerable() [contained in System.Data.DataSetExtensions].
Convert the LINQ result set to a List via .ToList() and bind it to the GridView.
If you can post a little more information (what exactly you mean by display, and the expected shape of the returned DataTable, i.e. what the columns in each row are), we can give you a better answer.
UPDATE
If you're storing the XML document above in your datastore and that is being returned in the table, try this code:
DataTable FBWorkDt = objBLL.get_FacebookWork(FacebookUserId);
XDocument xDoc = XDocument.Parse(FBWorkDt.Rows[0][0].ToString());
var query = from x in xDoc.Descendants("employer")
select new
{
id = (string)x.Element("id"),
name = (string)x.Element("name")
};
GrdWork.DataSource = query.ToList();
GrdWorkPub.DataBind();
Same basic principle as above, except this time you're querying over an XDocument instead of a DataTable.
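To sanity-check the query shape, here is the same XDocument query run directly against the exact XML string from the question, with no DataTable involved:

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

class Program
{
    static void Main()
    {
        // The XML exactly as stored in the WorkInfo column.
        const string xml =
            "<root><work><employer><id>208571635850052</id>" +
            "<name>Netizen Apps</name></employer></work>" +
            "<id>1076483621</id></root>";

        XDocument xDoc = XDocument.Parse(xml);

        // One anonymous-typed entry per <employer> element.
        var query = from x in xDoc.Descendants("employer")
                    select new
                    {
                        id = (string)x.Element("id"),
                        name = (string)x.Element("name")
                    };

        foreach (var e in query)
        {
            Console.WriteLine($"{e.id} {e.name}");
        }
    }
}
```

Binding query.ToList() to the grid then shows the id and name columns instead of the raw markup.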