I have seen some code like this (not on consecutive lines, but in the same function scope):
Dim con1 As SqlConnection
con1 = New SqlConnection
'More code here
con1 = New SqlConnection
con1 = Nothing
I believe this is just a bug, but I wanted to check that there is not some form of shadowing going on here that I am unaware of. What happens to the first object assigned to con1? I assume it becomes inaccessible, since there is no longer any reference to it.
What's Happening Here
con1 points to two different objects during the lifetime of that function.
The first object, created by the first
con1 = New SqlConnection
is no longer referenced after the second
con1 = New SqlConnection
is executed.
Is this a memory leak?
No. The object that is no longer referenced will eventually be disposed of, when the GC decides to do so. However, it is a resource leak. Every time you fail to close a SQL connection (assuming it was opened and not just allocated), you leave a resource unavailable for reuse. The GC will trigger when memory is low, so you will certainly regain the unreferenced object's memory at the latest when the system is low on memory (you will also regain the DB connection at that point). However, low resources will not trigger the GC. You could run completely out of DB connections before the GC ever decides to kick in and release the SqlConnection objects (including the DB connections they were hoarding).
Fixing the Code
Since a SqlConnection must be closed to release the underlying connection, the first object will hang around until the GC gets around to finalizing it. That is a bad thing, as SQL connections are a resource that should only be held as long as necessary.
Calling Close() on the first connection before assigning the new SqlConnection object would improve the situation (also, call Close() on the second instance before the variable goes out of scope).
However, if you get an Exception somewhere in the code, without proper exception handling, you would still be left with undisposed objects until the GC kicks in. Always, always put exception handling around anything that manages a resource such as this.
The very best way to put exception handling in place for this scenario is with the Using keyword. I have never written a line of VB.Net until now, but here's my attempt at a correct version of your code:
Using con1 As New SqlConnection()
    ' work with the first connection here
End Using

'More code here

Using con1 As New SqlConnection()
    ' work with the second connection here
End Using

' NOTE: I believe setting con1 = Nothing at the end is unnecessary,
' but it was necessary in VB6 and prior
' con1 = Nothing
Related
I'm trying to loop over a DataTable with more than 100,000 rows using Parallel.ForEach. Everything works fine up to around 25,000 iterations. I don't get any error, and I can see the app is still working, but it sort of blocks and nothing happens. I tried to encapsulate the loop in a Task.Factory.StartNew and I get a random abort exception at around 5,000 iterations for no apparent reason.
Dim lstExceptions As New ConcurrentQueue(Of Exception)
Dim options As New ParallelOptions
options.MaxDegreeOfParallelism = 3

Try
    Parallel.ForEach(ReservationReportDS.Tables(0).AsEnumerable(), options,
        Sub(row)
            Try
                Dim tmpRow As DataRow = CType(row, DataRow)
                Dim ReservationID As Integer = tmpRow.Field(Of Integer?)("autNoReservation")
                Dim customerID As Integer = tmpRow.Field(Of Integer?)("CustomerID")
                Dim VehiculeID As Integer = tmpRow.Field(Of Integer?)("autNoVehicule")

                Dim bill As New BillingPath()
                bill.Calculate_Billing(ReservationID, customerID, VehiculeID)
            Catch err As Exception
                lstExceptions.Enqueue(err)
            End Try
        End Sub
    )

    If (lstExceptions.Count > 0) Then
        Throw New AggregateException(lstExceptions)
    End If
Catch errAgg As AggregateException
    For Each ex As Exception In errAgg.InnerExceptions
        Log(Log_Billing_UI, "", System.Reflection.MethodBase.GetCurrentMethod().Name & GetExceptionInfo(ex))
    Next
Catch ex As Exception
    Log(Log_Billing_UI, "", System.Reflection.MethodBase.GetCurrentMethod().Name & GetExceptionInfo(ex))
End Try
Since you have such a large number of records, I'd recommend thinking about the following approach:
Read all records into a ConcurrentQueue(Of SomeBillingInfoClass) collection first. That way you don't keep the connection to the DB open, and the rest of the operations on the data read from the DB are thread-safe.
Create a list of Tasks with the billing calculation code inside. This lets you run the tasks in parallel and easily pass them the ConcurrentQueue from #1.
Keep the tasks looping while at least one element remains in the ConcurrentQueue.
If you can aggregate the billing calculation results into some other class, you can do that with an additional thread-safe ConcurrentQueue(Of BillingCalcResultInfoClass) collection.
After all billings are calculated, write to the DB on a single thread in one long transaction; this may be faster than granular writes to the DB.
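If it's useful, here is a minimal VB.NET sketch of that concept. The BillingInfo class, the worker count of three, and the column names are assumptions based on the code in your question; BillingPath.Calculate_Billing is taken from it:

Imports System.Collections.Concurrent
Imports System.Data
Imports System.Threading.Tasks

' Hypothetical DTO holding just the values each billing calculation needs.
Public Class BillingInfo
    Public Property ReservationID As Integer
    Public Property CustomerID As Integer
    Public Property VehiculeID As Integer
End Class

Public Module BillingRunner
    Public Sub RunBillings(ByVal table As DataTable)
        ' Step 1: copy everything out of the DataTable first, so the worker
        ' threads never touch the DB or the DataTable directly.
        Dim queue As New ConcurrentQueue(Of BillingInfo)
        For Each row As DataRow In table.Rows
            queue.Enqueue(New BillingInfo With {
                .ReservationID = row.Field(Of Integer)("autNoReservation"),
                .CustomerID = row.Field(Of Integer)("CustomerID"),
                .VehiculeID = row.Field(Of Integer)("autNoVehicule")})
        Next

        Dim errors As New ConcurrentQueue(Of Exception)

        ' Steps 2 and 3: a small fixed number of tasks drain the queue until it is empty.
        Dim workers(2) As Task
        For i As Integer = 0 To workers.Length - 1
            workers(i) = Task.Factory.StartNew(
                Sub()
                    Dim item As BillingInfo = Nothing
                    While queue.TryDequeue(item)
                        Try
                            Dim bill As New BillingPath()
                            bill.Calculate_Billing(item.ReservationID, item.CustomerID, item.VehiculeID)
                        Catch ex As Exception
                            errors.Enqueue(ex)
                        End Try
                    End While
                End Sub)
        Next

        Task.WaitAll(workers)

        ' Any failures surface here as a single AggregateException, as in the original code.
        If errors.Count > 0 Then
            Throw New AggregateException(errors)
        End If
    End Sub
End Module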
Some notes about your code: I think you may not need to throw the AggregateException manually; the .NET environment will do that for you automatically. You would only need to catch it in the task's .ContinueWith() method (sorry, I'm mostly a C# developer and use C# notation).
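For example, something roughly like this VB.NET sketch (the module name, task body, and logging placeholder are just illustrative):

Imports System.Threading.Tasks

Module ContinuationSketch
    Sub RunOneBilling()
        ' Sketch: let the task's own exception surface in a continuation instead of
        ' collecting and throwing an AggregateException yourself.
        Dim work As Task = Task.Factory.StartNew(
            Sub()
                ' billing calculation for one record goes here
            End Sub)

        work.ContinueWith(
            Sub(t)
                ' t.Exception is an AggregateException wrapping whatever the task threw.
                For Each ex As Exception In t.Exception.InnerExceptions
                    ' log ex here
                Next
            End Sub,
            TaskContinuationOptions.OnlyOnFaulted)
    End Sub
End Module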
I have used a similar approach to process millions of records and it works fine. Typically I use 3-5 tasks, but you can always experiment with how many tasks work best for you.
Using ConcurrentQueue or a similar thread-safe collection makes it much easier to keep your code thread-safe.
Please let me know if you have any questions.
Thank you all for your answers, and especially Anton Norko. I finally found the problem, and it was on my side. Under certain conditions, Calculate_Billing was stuck in an infinite loop. Since I used 3 threads at the same time, they were getting stuck one by one.
I have some legacy code I've inherited and am trying to figure out whether or not this code is causing a connection to my database to stay open. I'm not too familiar with ADO.NET, but from what I can see this should be causing a problem.
I have a function to get a data reader:
Public Shared Function ExecuteDataReader(ByRef ObjSqlCmd As SqlCommand, ByVal CSTRING As String) As IDataReader
Try
Dim cn As New SqlConnection(CSTRING)
cn.Open()
ObjSqlCmd.Connection = cn
Return ObjSqlCmd.ExecuteReader(CommandBehavior.CloseConnection)
Catch ex As Exception
If ObjSqlCmd.Connection.State = ConnectionState.Open Then ObjSqlCmd.Connection.Close()
Return Nothing
End Try
End Function
And then code using this to read:
Dim cmd As New SqlCommand
'set up command
Dim idr As IDataReader = ExecuteDataReader(cmd, "database")
If idr Is Nothing Then
If cmd.Connection.State = ConnectionState.Open Then cmd.Connection.Close()
Return arrResult
End If
While idr.Read
'read
End While
If cmd.Connection.State = ConnectionState.Open Then cmd.Connection.Close()
I can see it closing the connection at the end (assuming nothing goes wrong and no exception is thrown), but MSDN says you should always close the data reader. Does that get closed when the connection is closed?
Also from what I understand, the code
ObjSqlCmd.ExecuteReader(CommandBehavior.CloseConnection)
Will not close the connection unless the data reader is closed. Can someone explain what is happening here and whether everything might be closing correctly? I know the best practice is to use "using" and then try/catch/finally to close the reader, but the original developers did not seem to follow these practices.
Yes, it can leave the connection open. Any exception that occurs while reading from the data reader will bypass the closing statement. At some point, the garbage collector will kick in, dispose the connection, and release it back to the connection pool.
Until that point though, the connection can't be used. Worse, any locks acquired while executing the command will remain until the connection is closed.
Using ... using isn't a best practice just for the sake of it; it's actually a lot simpler and safer to manage commands and connections this way.
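For illustration, here is a rough sketch of what the reading side can look like when the connection, command, and reader are each wrapped in Using blocks. The query text and the way the connection string is passed in are placeholders, not your actual code:

Imports System.Data.SqlClient

Public Module ReaderExample
    Public Sub ReadRows(ByVal connectionString As String)
        Using cn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand("SELECT * FROM SomeTable", cn) ' placeholder query
                cn.Open()
                ' The reader's Using block guarantees it is closed even if Read() throws,
                ' and the connection's Using block returns the connection to the pool.
                Using idr As SqlDataReader = cmd.ExecuteReader()
                    While idr.Read()
                        ' read values here
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module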
Aim: Calling a very slow stored procedure (with parameters) asynchronously from the code-behind of an ASP.NET web page via a single function call, and then forgetting about it.
Notes: I tried using SqlCommand.BeginExecuteNonQuery without calling SqlCommand.EndExecuteNonQuery (see the code below), but the stored procedure didn't run. (I used a tiny stored procedure that updates a single field on a table for testing, but the field was not updated. It does get updated when I use SqlCommand.ExecuteNonQuery.)
I am not interested in and don't know when the stored procedure will end. (So I can't wait for it to finish to call SqlCommand.EndExecuteNonQuery.)
Situation:
After some research, I found out that SQL jobs can be used for this purpose. As for the SP parameters, I can store them in a table and the SP can then read them. However, I don't know if this is the right approach to my problem (I am new to the SQL world). I hope you can comment on the use of a SQL job for this purpose, or suggest alternatives. Thanks.
Code:
using (SqlConnection connection = new SqlConnection(
#"Data Source=XXX\MSSQL2008R2;Initial Catalog=YYY;Integrated Security=True"
+ ";Asynchronous Processing=true"))
{
connection.Open();
SqlCommand cmd = new SqlCommand("spTestProcedure", connection);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add(new SqlParameter("#pText", DateTime.Now.ToString()));
cmd.BeginExecuteNonQuery(); //Doesn't work.
//cmd.ExecuteNonQuery(); // This works.
}
I think you should simply execute your sp in a separate thread.
http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx
Use ThreadPool for example to make a sync call on a separate thread.
It will look something like this...
Extract method:
private void ExecuteCommandSync()
{
using (SqlConnection connection = new SqlConnection(
#"Data Source=XXX\MSSQL2008R2;Initial Catalog=YYY;Integrated Security=True"
+ ";Asynchronous Processing=true"))
{
connection.Open();
SqlCommand cmd = new SqlCommand("spTestProcedure", connection);
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add(new SqlParameter("#pText", DateTime.Now.ToString()));
cmd.ExecuteNonQuery();
}
}
Change your code:
ThreadPool.QueueUserWorkItem((x) => { ExecuteCommandSync(); });
It will make a synchronous call on a thread from the ThreadPool; once it is done, it will close the connection and you are done.
It is not the BEST solution performance-wise, because you will have a thread blocked while it waits for the stored proc, but it is good enough and will do what you want.
I was looking for a way of doing this from the code-behind of ASP.NET, but realized there is no easy way of doing it (I didn't want to have to think about connection or timeout problems). I ended up doing it via a web service.
From the ASP.NET code-behind, I call my web service's function synchronously.
Within the web service's function, I call the stored procedure asynchronously using SqlCommand.BeginExecuteNonQuery(AsyncCallback, Object).
The callback is handled within the web service (for error handling).
Hence my web page keeps working the way I want: Fire the request once, then forget about it.
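As a rough sketch of that idea (the module and method names are made up; the connection string, procedure name, and parameter come from the code earlier in this question), the fire-and-forget call with a callback could look like this:

Imports System.Data
Imports System.Data.SqlClient

Public Module FireAndForget
    Public Sub StartSlowProcedure()
        ' Asynchronous Processing=true is required for Begin/EndExecuteNonQuery on older .NET versions.
        Dim connection As New SqlConnection(
            "Data Source=XXX\MSSQL2008R2;Initial Catalog=YYY;Integrated Security=True;Asynchronous Processing=true")
        Dim cmd As New SqlCommand("spTestProcedure", connection)
        cmd.CommandType = CommandType.StoredProcedure
        cmd.Parameters.Add(New SqlParameter("@pText", DateTime.Now.ToString()))
        connection.Open()
        ' Pass the command as state so the callback can complete the call and close the connection.
        cmd.BeginExecuteNonQuery(AddressOf OnCompleted, cmd)
    End Sub

    Private Sub OnCompleted(ByVal result As IAsyncResult)
        Dim cmd As SqlCommand = CType(result.AsyncState, SqlCommand)
        Try
            cmd.EndExecuteNonQuery(result)
        Catch ex As Exception
            ' Handle or log the failure here.
        Finally
            cmd.Connection.Close()
        End Try
    End Sub
End Module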
I have a huge problem.
Take a look at this sample code:
Private Sub FUNCTION1()
    Dim conn As New MySqlConnection(myconnstring)
    conn.Open()
    ' do something
    FUNCTION2()
    ' ...
    conn.Close()
End Sub

Private Sub FUNCTION2()
    Dim conn As New MySqlConnection(myconnstring)
    ' ...
    conn.Open()
    ' do something
    conn.Close()
End Sub
Even though I regularly close all my connections, they remain "open" on the MySQL server.
I know this because I'm using the MySQL administration tool to check how many connections I open, "line by line" through my source code.
In fact, I frequently get "user has exceeded the 'max_user_connections' resource (current value: 5)".
My host permits ONLY 5 connections, but I think that if I write GOOD source code, this should not be a problem.
So my question is: why do those "damn" connections remain open?
Thank you in advance!
Consider wrapping your MySqlConnection operations in a Using statement...
Any object that you instantiate in the Using ... New statement will be disposed of properly for you. Once the End Using statement is reached, the object goes out of scope and is disposed. Any objects that are declared inside the Using block (rather than in its header) still need to be disposed of by the developer, as per normal.
Using conn As New MySqlConnection(myConnString)
Using cmd As New MySqlCommand("SELECT * FROM Employees", conn)
conn.Open()
Using rdr As MySqlDataReader = cmd.ExecuteReader()
While rdr.Read()
Console.WriteLine(rdr(0))
End While
End Using
End Using
End Using
In this case, you don't HAVE to encase your Command and Reader in their own Using, but it allows you to take advantage of the fact that they all implement IDisposable.
Try calling conn.Dispose() and then setting it to null.
On another note, are you opening a connection in both FUNCTION1 and FUNCTION2? It appears that you're creating 2 connections. If that's the case, try to implement a singleton pattern to manage your connections.
In ASP.NET, I have seen people coding like this:
using (SqlConnection myConnection = new SqlConnection(AppConfiguration.ConnectionString))
{
// Do the database transactions
}
How does it differ from this one?
SqlConnection myConnection = new SqlConnection(AppConfiguration.ConnectionString);
// Do the database transactions
Are there any performance/speed improvements in using one over the other?
The using statement allows the programmer to specify when objects that use resources should release them. The object provided to the using statement must implement the IDisposable interface. This interface provides the Dispose method, which should release the object's resources.
A using statement can be exited either when the end of the using statement is reached or if an exception is thrown and control leaves the statement block before the end of the statement.
A good article can be found here
Understanding the 'using' statement in C#
It's just a shortcut. :)
using (var foo = new Foo())
foo.bar();
equals to:
Foo foo = new Foo();
try
{
foo.bar();
}
finally
{
if (foo != null)
((IDisposable)foo).Dispose();
}
The using keyword ensures that the object will be disposed (it must implement IDisposable).
It's useful when working with external resources (database connections, file streams, etc.); they will be released even if errors occur.
Using: Defines a scope, outside of which an object or objects will be disposed.
So it's just shorthand for creating the object and then disposing of it yourself.
More details on MSDN
Objects created in the using construct have a lifetime limited to the construct (the braces {}), and note that only types which implement the IDisposable interface can be used there. It simply means that right after the using block, Dispose is automatically called on the object you've created, so resources are released promptly instead of waiting for garbage collection.
In the case of SqlConnection objects, we must call Dispose (or Close); otherwise the connection is never returned to the ADO.NET connection pool (which manages connections to the database), and the pool will have to open a brand-new connection for the next incoming request instead of reusing an existing one. Connection pooling is simply a built-in way to minimize the resources and time taken to acquire a connection to a database.
Refer:
Connection Pooling
IDisposable
The using statement is a shortcut for:
try
{
/* do the work here with obj */
}
finally
{
if (obj != null)
obj.Dispose();
}
In general, you should use the using syntax for any object that implements IDisposable - for example, SqlConnection.
The using statement ensures that the object is disposed correctly at the end of the block in (almost) all circumstances, even if exceptions occur.
There aren't any direct speed/performance differences either way, but if you don't dispose of your IDisposable objects (usually by using using) then you might encounter problems down the line because vital resources haven't been tidied up.
Using using is good practice and almost always the right way to do things. Not using using is usually a bad idea, unless you're absolutely certain of what you're doing and why.
The using statement automatically provides the finally block for you, i.e. once the using block is finished, it will dispose of the object.
using (SqlConnection myConnection = new SqlConnection(AppConfiguration.ConnectionString))
{
// Do the database transactions
}
is essentially equivalent to the following (the using version additionally wraps the cleanup in a try/finally, as shown in the other answers, so it runs even if an exception is thrown):
SqlConnection myConnection = new SqlConnection(AppConfiguration.ConnectionString);
// Do the database transactions
myConnection.Close();
myConnection.Dispose();