ASP.NET MySQL update multiple records - asp.net

I have a web page that needs to update multiple records. The page gathers all the information and then begins a transaction, sending multiple UPDATE queries to the database.
foreach row
{
    // Prepare the parameters for this row
    Hashtable Item = new Hashtable();
    Item.Add("Id", Id);
    Item.Add("Field1", Field1);
    Item.Add("Field2", Field2);
    Item.Add("Field3", Field3);
    ...
    Params.Add(Item);
}
Then we launch the transaction:
DO CHANGES()
public void execute_NonQuery_procedure_transaction(string StoredProcedure, List<Hashtable> Params)
{
    using (MySqlConnection oConnection = new MySqlConnection(ConfigurationManager.AppSettings[DB]))
    {
        MySqlTransaction oTransaction;
        bool HasErrors = false;
        oConnection.Open();
        oTransaction = oConnection.BeginTransaction();
        try
        {
            MySqlCommand oCommand = new MySqlCommand(StoredProcedure, oConnection);
            oCommand.CommandType = CommandType.StoredProcedure;
            oCommand.Transaction = oTransaction;
            foreach (Hashtable hParams in Params)
            {
                oCommand.Parameters.Clear();
                IDictionaryEnumerator en = hParams.GetEnumerator();
                while (en.MoveNext())
                {
                    oCommand.Parameters.AddWithValue("_" + en.Key.ToString(), en.Value);
                    oCommand.Parameters["_" + en.Key.ToString()].Direction = ParameterDirection.Input;
                }
                oCommand.ExecuteNonQuery();
            }
        }
        catch (Exception e)
        {
            HasErrors = true;
            throw e;
        }
        finally
        {
            if (HasErrors)
                oTransaction.Rollback();
            else
                oTransaction.Commit();
            oConnection.Close();
        }
    }
}
Is there another way to do this or this is the most efficient way?

It depends on the situation. If you have multiple row updates, new rows, deleted rows, or a combination of these modifying the same table, then the efficient way to do this is a batch update...
Please go through this link Batch Update
Hope this helps...

It looks fine to me. You could avoid clearing the Command.Parameters list on every iteration and just reassign the parameter values on the following iterations, but that probably brings no visible improvement.
Pay attention, though: your throw is wrong. In C#, don't use throw e; use a bare throw; so the original stack trace is preserved.
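To illustrate, here is a minimal sketch of that variant for the loop inside execute_NonQuery_procedure_transaction: the parameters are created once from the first item, only the values are reassigned on each iteration, and the bare throw preserves the stack trace (this assumes every Hashtable in Params carries the same keys; commit/rollback is folded into the try/catch for brevity):
MySqlCommand oCommand = new MySqlCommand(StoredProcedure, oConnection);
oCommand.CommandType = CommandType.StoredProcedure;
oCommand.Transaction = oTransaction;
// Create the parameters once, using the keys of the first item.
foreach (DictionaryEntry entry in Params[0])
{
    oCommand.Parameters.AddWithValue("_" + entry.Key, entry.Value);
    oCommand.Parameters["_" + entry.Key].Direction = ParameterDirection.Input;
}
try
{
    foreach (Hashtable hParams in Params)
    {
        // Reassign values only; no Parameters.Clear() needed.
        foreach (DictionaryEntry entry in hParams)
            oCommand.Parameters["_" + entry.Key].Value = entry.Value;
        oCommand.ExecuteNonQuery();
    }
    oTransaction.Commit();
}
catch
{
    oTransaction.Rollback();
    throw; // rethrow without resetting the stack trace
}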

Related

How can I bind an object to a GridView?

My code is
public Emp GetEmpByEmpno(int empno)
{
    using (con)
    {
        if (con.State == ConnectionState.Closed)
        {
            con.ConnectionString = constr;
            con.Open();
        }
        cmd.CommandText = "sp_emp_GetempByEmpno";
        cmd.Parameters.Clear();
        cmd.Parameters.AddWithValue("@eno", empno);
        dr = cmd.ExecuteReader();
        Emp obj = null;
        while (dr.Read())
        {
            obj = new Emp();
            obj.Empno = int.Parse(dr["Empno"].ToString());
            obj.Ename = dr["Ename"].ToString();
            obj.Sal = dr["Sal"].ToString();
            obj.Deptno = int.Parse(dr["Deptno"].ToString());
        }
        return obj;
    }
}
Here I fetch the record based on the employee number. Whenever I enter an empno in the textbox and click the search button, the respective employee should be displayed in the GridView. How can I bind the object to the GridView?
Employee obj=EmpDeptBus.GetEmployeeByEmpno(int.Parse(txtEmpno.Text));
gvemp.DataSource = c;
gvemp.DataBind();
You should be able to just say
gvemp.DataSource = obj;
That's really all you need to do to bind the object.
Also, change your
while(dr.Read())
to
if(dr.Read())
You're only expecting one record so only fetch one. Also put your return obj outside your using to make sure everything is properly disposed before you return to the calling function.
Try making sure that txtEmpno.Text holds an int value before you attempt to pass it to this method or it will blow up. Never, ever trust user input. You could do something like:
int empNo = 0;
if (int.TryParse(txtEmpNo.Text.Trim(), out empNo))
{
    // then call the function and bind your grid using the empNo as the
    // variable holding the employee number.
}
else
{
    // otherwise handle the fact that the user entered a non-numeric.
}
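Putting the pieces together, a rough sketch of the search button handler could look like this (btnSearch_Click is an assumed name; if assigning the single object directly doesn't bind, wrapping it in a one-element list is a safe fallback, since GridView binds enumerable data sources):
// Sketch only: handler and business-layer names are assumptions based on the question.
protected void btnSearch_Click(object sender, EventArgs e)
{
    int empNo;
    if (int.TryParse(txtEmpno.Text.Trim(), out empNo))
    {
        Emp obj = EmpDeptBus.GetEmpByEmpno(empNo); // however the method is exposed in your business layer
        if (obj != null)
        {
            // GridView expects an enumerable source, so wrap the single object in a list.
            gvemp.DataSource = new List<Emp> { obj };
            gvemp.DataBind();
        }
    }
    else
    {
        // Handle non-numeric input, e.g. show a validation message.
    }
}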

Adding value to dropdownlist

I have a web service method that returns the values of one column of a table. I want to add those values to my drop-down list. Is there an easy way to do it?
Here is my web method, which returns every conference_name in the Conference table.
[WebMethod(Description = "Retrieves all Conference")]
public DataSet GetAllConference()
{
    DataSet dataSet = new DataSet();
    // Create connection object
    OleDbConnection oleConn = new OleDbConnection(connString);
    try
    {
        oleConn.Open();
        string sql = "SELECT conference_name FROM Conference";
        OleDbDataAdapter dataAdapter = new OleDbDataAdapter(sql, oleConn);
        dataAdapter.Fill(dataSet, "Conference");
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.ToString());
    }
    finally
    {
        oleConn.Close();
    }
    if (dataSet.Tables.Count <= 0)
        return null;
    else
        return dataSet;
}
On the user side there will be one drop-down list. How can I add the values returned by the web method to the drop-down list?
You need to add a web reference to the web service in order to call the method. You may read this post to learn how to add the reference. After adding the reference you can call the method and fill the drop-down with the code given below:
DataSet ds = wsObject.GetAllConference();
if (ds.Tables.Count > 0)
{
    ddlist.DataTextField = "conference_name";
    ddlist.DataValueField = "conference_name"; // Change this to the field you want as the value.
    //ddlist.DataValueField = "IDColumnInTheDataTable"; // Uncomment after supplying the right column name.
    ddlist.DataSource = ds.Tables[0];
    ddlist.DataBind();
}
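One caveat: GetAllConference returns null when nothing was read, so it may be safer to guard against that before touching ds.Tables, for example:
DataSet ds = wsObject.GetAllConference();
if (ds != null && ds.Tables.Count > 0) // the web method returns null when no table was filled
{
    // bind the drop-down as shown above
}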

Executing stored procedure with asp.net

I am trying to execute a stored procedure from asp.net. The stored procedure requires 3 parameters, all of which are IDs (ints): TaskID, ExhibitID, and InvestigatorID.
I have a hidden field that contains an array of ExhibitID's that came from a javascript function.
My question is how do I get the query to execute as I am looping through the array?
Here is the code that calls my stored procedure:
var cnSaveTask = new SqlConnection(ConfigurationManager.ConnectionStrings["OSCIDConnectionString"].ToString());
var comLinkExhibitToTask = new SqlCommand("p_CaseFileTasksExhibitLinkAdd", cnSaveTask) { CommandType = CommandType.StoredProcedure };
foreach (string exhibit in hidExhibitsIDs.Value.Split(','))
{
    comLinkExhibitToTask.Parameters.AddWithValue("@TaskID", taskID);
    comLinkExhibitToTask.Parameters.AddWithValue("@ExhibitID", Convert.ToInt32(exhibit));
    comLinkExhibitToTask.Parameters.AddWithValue("@InvestigatorID", int.Parse(Session["InvestigatorID"].ToString()));
}
try
{
    cnSaveTask.Open();
    comLinkExhibitToTask.ExecuteNonQuery();
}
It is not working against my DB though; nothing gets added. My guess is that since it is iterating and not executing, it just keeps replacing the ExhibitID every time and then eventually tries to execute it. But I don't think just adding comLinkExhibitToTask.ExecuteNonQuery() outside the try is a good idea. Any suggestions?
You can either move the try block into the foreach loop or wrap the foreach loop with a try block, depending on what error handling you want: continue with the next exhibit on error, or completely abort execution.
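For example, a sketch of the "continue with the next exhibit" variant, with the try block inside the loop (parameter names and values are taken from the question; the logging call is only illustrative):
foreach (string exhibit in hidExhibitsIDs.Value.Split(','))
{
    try
    {
        comLinkExhibitToTask.Parameters.Clear();
        comLinkExhibitToTask.Parameters.AddWithValue("@TaskID", taskID);
        comLinkExhibitToTask.Parameters.AddWithValue("@ExhibitID", Convert.ToInt32(exhibit));
        comLinkExhibitToTask.Parameters.AddWithValue("@InvestigatorID", int.Parse(Session["InvestigatorID"].ToString()));
        comLinkExhibitToTask.ExecuteNonQuery();
    }
    catch (Exception ex)
    {
        // Log the failure and move on to the next exhibit instead of aborting the whole batch.
        ErrorLogger.Log(0, ex.Source, ex.Message);
    }
}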
I've never used AddWithValue, so I can't speak to its functionality. Here's how I typically write a DB call like this.
using (SqlConnection cnSaveTask = new SqlConnection(ConfigurationManager.ConnectionStrings["OSCIDConnectionString"].ConnectionString))
{
    cnSaveTask.Open();
    using (SqlCommand comLinkExhibitToTask = new SqlCommand("p_CaseFileTasksExhibitLinkAdd", cnSaveTask))
    {
        comLinkExhibitToTask.CommandType = CommandType.StoredProcedure;
        comLinkExhibitToTask.Parameters.Add(new SqlParameter("@TaskID", SqlDbType.Int) { Value = taskID });
        // etc.
        comLinkExhibitToTask.ExecuteNonQuery();
    }
}
The solution:
var cnSaveTask = new SqlConnection(ConfigurationManager.ConnectionStrings["OSCIDConnectionString"].ToString());
try
{
    var comLinkExhibitToTask = new SqlCommand("p_CaseFileTasksExhibitLinkAdd", cnSaveTask) { CommandType = CommandType.StoredProcedure };
    cnSaveTask.Open();
    comLinkExhibitToTask.Parameters.Add(new SqlParameter("@TaskID", SqlDbType.Int));
    comLinkExhibitToTask.Parameters.Add(new SqlParameter("@ExhibitID", SqlDbType.Int));
    comLinkExhibitToTask.Parameters.Add(new SqlParameter("@InvestigatorID", SqlDbType.Int));
    foreach (string exhibit in hidExhibitsIDs.Value.Split(','))
    {
        comLinkExhibitToTask.Parameters["@TaskID"].Value = taskID;
        comLinkExhibitToTask.Parameters["@ExhibitID"].Value = Convert.ToInt32(exhibit);
        comLinkExhibitToTask.Parameters["@InvestigatorID"].Value = int.Parse(Session["InvestigatorID"].ToString());
        comLinkExhibitToTask.ExecuteNonQuery();
    }
}
catch (Exception ex)
{
    ErrorLogger.Log(0, ex.Source, ex.Message);
}
finally
{
    if (cnSaveTask.State == ConnectionState.Open)
    {
        cnSaveTask.Close();
    }
}
Since I was in a loop, it kept adding parameters. So declare the parameters outside the loop and only assign the values inside the loop. That way there are only 3 parameters, and the values are set on each iteration.

Raven DB DocumentStore - throws out of memory exception

I have code like this:
public bool Set(IEnumerable<WhiteForest.Common.Entities.Projections.RequestProjection> requests)
{
    var documentSession = _documentStore.OpenSession();
    //{
    try
    {
        foreach (var request in requests)
        {
            documentSession.Store(request);
        }
        //requests.AsParallel().ForAll(x => documentSession.Store(x));
        documentSession.SaveChanges();
        documentSession.Dispose();
        return true;
    }
    catch (Exception e)
    {
        _log.LogDebug("Exception in RavenRequstRepository - Set. Exception is [{0}]", e.ToString());
        return false;
    }
    //}
}
This code gets called many times. After around 50,000 documents have passed through it, I get an OutOfMemoryException.
Any idea why? Perhaps after a while I need to declare a new DocumentStore?
Thank you.
UPDATE:
I ended up using the Batch/Patch API to perform the update I needed.
You can see the discussion here: https://groups.google.com/d/topic/ravendb/3wRT9c8Y-YE/discussion
Basically, since I only needed to update one property on my objects, and after considering Ayende's comments about re-serializing all the objects back to JSON, I did something like this:
internal void Patch()
{
    List<string> docIds = new List<string>() { "596548a7-61ef-4465-95bc-b651079f4888", "cbbca8d5-be45-4e0d-91cf-f4129e13e65e" };
    using (var session = _documentStore.OpenSession())
    {
        session.Advanced.DatabaseCommands.Batch(GenerateCommands(docIds));
    }
}

private List<ICommandData> GenerateCommands(List<string> docIds)
{
    List<ICommandData> retList = new List<ICommandData>();
    foreach (var item in docIds)
    {
        retList.Add(new PatchCommandData()
        {
            Key = item,
            Patches = new[] { new Raven.Abstractions.Data.PatchRequest() {
                Name = "Processed",
                Type = Raven.Abstractions.Data.PatchCommandType.Set,
                Value = new RavenJValue(true)
            }}
        });
    }
    return retList;
}
Hope this helps...
Thanks a lot.
I just did this for my current project. I chunked the data into pieces and saved each chunk in a new session. This may work for you, too.
Note, this example shows chunking by 1024 documents at a time, but needing at least 2000 before we decide it's worth chunking. So far, my inserts got the best performance with a chunk size of 4096. I think that's because my documents are relatively small.
internal static void WriteObjectList<T>(List<T> objectList)
{
    int numberOfObjectsThatWarrantChunking = 2000; // Don't bother chunking unless we have at least this many objects.
    if (objectList.Count < numberOfObjectsThatWarrantChunking)
    {
        // Just write them all at once.
        using (IDocumentSession ravenSession = GetRavenSession())
        {
            objectList.ForEach(x => ravenSession.Store(x));
            ravenSession.SaveChanges();
        }
        return;
    }

    int numberOfDocumentsPerSession = 1024; // Chunk size

    List<List<T>> objectListInChunks = new List<List<T>>();
    for (int i = 0; i < objectList.Count; i += numberOfDocumentsPerSession)
    {
        objectListInChunks.Add(objectList.Skip(i).Take(numberOfDocumentsPerSession).ToList());
    }

    Parallel.ForEach(objectListInChunks, listOfObjects =>
    {
        using (IDocumentSession ravenSession = GetRavenSession())
        {
            listOfObjects.ForEach(x => ravenSession.Store(x));
            ravenSession.SaveChanges();
        }
    });
}

private static IDocumentSession GetRavenSession()
{
    return _ravenDatabase.OpenSession();
}
Are you trying to save it all in one call?
The DocumentSession needs to turn all of the objects that you pass it into a single request to the server. That means it may allocate a lot of memory for the write to the server.
Usually we recommend batches of about 1,024 items if you are doing bulk saves.
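A bare-bones sequential version of that advice, applied to the Set method from the question, might look like this (the batch size and materializing the IEnumerable into a list are assumptions, not the author's code):
// Sketch: save in batches of ~1,024 documents, one short-lived session per batch.
var requestList = requests.ToList();
const int batchSize = 1024;
for (int i = 0; i < requestList.Count; i += batchSize)
{
    using (var session = _documentStore.OpenSession())
    {
        foreach (var request in requestList.Skip(i).Take(batchSize))
        {
            session.Store(request);
        }
        session.SaveChanges();
    }
}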
DocumentStore is a disposable class, so I worked around this problem by disposing the instance after each chunk. I highly doubt this is the most efficient way to run operations, but it will prevent significant memory overhead from happening.
I was running a sort of "delete all" operation like so. You can see the using blocks disposing both the DocumentStore and the IDocumentSession objects after each chunk.
static DocumentStore GetDataStore()
{
    DocumentStore ds = new DocumentStore
    {
        DefaultDatabase = "test",
        Url = "http://localhost:8080"
    };
    ds.Initialize();
    return ds;
}

static IDocumentSession GetDbInstance(DocumentStore ds)
{
    return ds.OpenSession();
}

static void Main(string[] args)
{
    int deleteCount = 0;
    int deleteSum = 0;
    do
    {
        using (var ds = GetDataStore())
        using (var db = GetDbInstance(ds))
        {
            // The `Take` operation will cap out at 1,024 by default, per Raven documentation
            var list = db.Query<MyClass>().Skip(deleteSum).Take(5000).ToList();
            deleteCount = list.Count;
            deleteSum += deleteCount;
            foreach (var item in list)
            {
                db.Delete(item);
            }
            db.SaveChanges();
            list.Clear();
        }
    } while (deleteCount > 0);
}

SQL CE 3.5 problem with TableDirect table access

I am trying to insert hundreds of records into an empty database table using the TableDirect type of SqlCeCommand. The problem is that I get a SqlCeException "Unspecified error" when calling SqlCeResultSet.Insert. Below is my code. Any hints?
Thanks
public bool StoreEventsDB2(List<DAO.Event> events)
{
    try
    {
        SqlCeCommand command = new SqlCeCommand("Event");
        command.CommandType = System.Data.CommandType.TableDirect;
        SqlCeResultSet rs = _databaseManager.ExecuteResultSet(command, ResultSetOptions.Updatable | ResultSetOptions.Scrollable);
        foreach (DAO.Event theEvent in events)
        {
            SqlCeUpdatableRecord record = rs.CreateRecord();
            record.SetInt32(0, theEvent.ID);
            record.SetInt32(1, theEvent.ParentID);
            record.SetString(2, theEvent.Name);
            record.SetDateTime(3, theEvent.DateTime);
            record.SetDateTime(4, theEvent.LastSynced);
            record.SetInt32(5, theEvent.LastSyncedTS);
            record.SetString(6, theEvent.VenueName);
            record.SetBoolean(7, theEvent.IsParentEvent);
            record.SetDateTime(11, DateTime.Now);
            rs.Insert(record);
        }
    }
    catch (SqlCeException e)
    {
        Log.Logger.GetLogger().Log(Log.Logger.LogLevel.ERROR, "[EventManager::StoreEventsDB] error: {0}", e.Message);
        return false;
    }
    catch (Exception e)
    {
        Log.Logger.GetLogger().Log(Log.Logger.LogLevel.ERROR, "[EventManager::StoreEventsDB] error: {0}", e.Message);
        return false;
    }
    return true;
}
I am unsure how your connection is managed by the database manager, which could be the culprit - make sure you are using one connection (SQL CE doesn't play nice with multiple connections). Also, the ResultSetOptions.Scrollable option is not needed (at least I have never used it for an insert).
Below is the syntax I use when doing direct table inserts. Every database/data-access object is wrapped in a using statement to dispose of objects after use - this is very important, especially with the Compact Framework and SQL CE, as the garbage collection is less than ideal (you WILL get out-of-memory exceptions!). I have also added a transaction to your code so that the operation is all or nothing.
Hope this helps:
using (var transaction = connection.BeginTransaction())
{
    using (var command = connection.CreateCommand())
    {
        command.Transaction = transaction;
        command.CommandType = CommandType.TableDirect;
        command.CommandText = "Event";
        using (var rs = command.ExecuteResultSet(ResultSetOptions.Updatable))
        {
            var record = rs.CreateRecord();
            foreach (DAO.Event theEvent in events)
            {
                record.SetInt32(0, theEvent.ID);
                record.SetInt32(1, theEvent.ParentID);
                record.SetString(2, theEvent.Name);
                record.SetDateTime(3, theEvent.DateTime);
                record.SetDateTime(4, theEvent.LastSynced);
                record.SetInt32(5, theEvent.LastSyncedTS);
                record.SetString(6, theEvent.VenueName);
                record.SetBoolean(7, theEvent.IsParentEvent);
                record.SetDateTime(11, DateTime.Now);
                rs.Insert(record);
            }
        }
        transaction.Commit();
    }
}
