Good evening!
I still have a doubt about ASP.NET connection pooling. I work on an application that sometimes throws the exception "max pool size was reached".
My team looked over and over for code that was leaking connections, but nothing was found.
But now here comes my doubt: when it says "max pool size was reached", does it mean the max pool size was reached for the database or for the server?
If SQL Server hosts several databases for several different ASP.NET applications, can those other databases interfere with mine (my database is on the same SQL Server)? E.g., if some application is leaking connections, can that leak generate "max pool size" errors in my application?
Thanks!
ADO.NET maintains separate connection pool groups per connection string. So if you have multiple databases, they should have their own pools, and those pools should not interfere with each other.
Is it possible that some requests are taking a long time? If enough requests are executed at once against a single database and they are delayed, the pool's maximum size really may be reached simply because that many connections are open at once.
To verify this is not the case, check what is running on the database by executing sp_who (or sp_who2), or by opening SQL Server Activity Monitor.
You can also query the DMVs in the database to see how many connections various programs have open by database:
select
    null as [Connections by Database],
    s.[host_name] as [Client Machine],
    DB.name as [Database],
    s.[program_name] as [Client Program],
    count(*) as [Open Connections]
from sys.dm_exec_sessions s
left join sys.dm_exec_requests r
    on r.session_id = s.session_id
left join sys.databases DB
    on DB.database_id = r.database_id
where s.session_id >= 50 -- Ignore SQL Server system sessions
group by s.[host_name], DB.name, s.[program_name]
order by [Client Machine], [Database], [Client Program]
If you find that you genuinely need more connections, you can raise the limit by setting the Max Pool Size property in the connection string to something higher than the default of 100.
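For example, a connection string raising the limit to 200 might look like this (the server and database names are placeholders):

```
Server=myServer;Database=myDatabase;Integrated Security=true;Max Pool Size=200;
```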
It is possible to dig through the .NET objects in a memory dump if you want to see which pool is causing the problem. You would capture a dump of the w3wp.exe process and analyze it with a tool like WinDbg (or possibly the Debug Diagnostics Tool). I have done this in the past; it is not necessarily easy, but it can help a lot.
EDIT
There is a perfmon counter for ADO.NET connection pooling that you can use to monitor leaked connections. In Performance Monitor, expand .NET Data Provider for SqlServer and add the counter NumberOfReclaimedConnections. According to the documentation, this counter means:
The number of connections that have been reclaimed through garbage
collection where Close or Dispose was not called by the application.
Not explicitly closing or disposing connections hurts performance.
We have used this counter to verify that our application was leaking connections.
Related
In my IIS server I have many application pools (around 6 to 7), and many ASP.NET applications run in each of them (e.g. 25 applications per pool). They all connect to an Oracle database using ADO.NET.
All the applications work just fine, but sometimes we get an error like:
Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached.
I know the likely causes for this, such as not closing our database connections properly. So here is my headache: I don't want to go through each and every project to find where we forgot to close connections; that is a very time-consuming task for us.
So, is there any way to identify which application is leaving connections open? Can we see it from IIS itself? Can we build some kind of utility to track which project is leaving connections open?
I'm not sure it's a problem with the database connections. I think your applications are not disposing the context, so the garbage collector can't free the memory. You could try reducing the recycling interval of your application pools and then check whether memory usage decreases.
We are using NHibernate with the OpenSessionInView pattern for our ASP.NET web app.
Using an ADO connection (SQL Server), we want to log every page access to a different database. For that, do we need to open a connection on every page load, execute the insert, and then close the connection, or can we keep the same connection shared among all requests?
What about locks and concurrent access? We only do inserts on this database.
Yes, I'd go with open --> insert --> close. The reason is that SQL connections (and most DB connections, depending on the driver) are pooled, so opening a "new" connection really means getting a connection from the pool, which is inexpensive (unless you are running out of connections in the pool). If, on the other hand, you hold on to a single open connection, you'll end up with a ton of concurrency issues, since you'll have to synchronize access to that connection object for every request. A nightmare, in other words: you'll be blocking your requests and slowing things down considerably.
Again, you are not really improving performance (quite the contrary), and you are complicating your app.
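As a minimal sketch of the open --> insert --> close pattern (the connection string, table, and column names here are invented for illustration), the using blocks guarantee the connection is returned to the pool even if the insert throws:

```csharp
using System.Data.SqlClient;

// Hypothetical logging helper, called from each page load.
static void LogPageAccess(string pageName)
{
    const string connStr =
        "Server=logServer;Database=AccessLog;Integrated Security=true;";

    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand(
        "insert into PageAccess (PageName, AccessedAt) values (@page, getdate())",
        conn))
    {
        cmd.Parameters.AddWithValue("@page", pageName);
        conn.Open();           // cheap: usually just grabs a pooled connection
        cmd.ExecuteNonQuery();
    } // Dispose closes the connection, i.e. returns it to the pool
}
```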
I am creating a .NET application with a SQL Server database engine. I would like my site to be accessed by thousands of users per second. What does the number of connections depend on?
How many connections can IIS hold, and how many can SQL Server?
First, there is a difference between connections and connection pools. It is good to look into that, as it makes a huge difference in performance. If you need a reference I can dig one up, but Google/Bing is your friend here. The key takeaway is: keep the number of connection pools to a minimum.
With that said, the number of connections depends on two things.
Are you using Windows Auth? If so, every distinct user will cause a distinct connection pool.
If you are using SQL Auth, then each different connection string will cause a new pool to be created (even a single extra space will cause a new pool).
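To illustrate how literal the pool key is, these two connection strings differ only by a single space after the first semicolon, yet ADO.NET treats them as different keys and creates two separate pools (names and credentials are placeholders):

```
Server=myServer;Database=myDb;User Id=appUser;Password=secret;
Server=myServer; Database=myDb;User Id=appUser;Password=secret;
```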
In regards to the scaling question, both IIS and SQL Server can handle a very high number of connections. If you are running into connection limits, you should probably take a look at the application design.
Erick
The number of connections really depends on the physical makeup and optimization of your server and how far you can push it. You can throttle down the number of concurrent connections in the IIS configuration, as well as in SQL Server, if you want to cap how many connections are allowed.
That is no problem for Windows and/or SQL Server:
Windows is configured by default to handle 1,000 to 2,000 concurrent TCP/IP connections.
For SQL Server, it will also depend on the licensing and/or hardware, which you did not specify. What kind of hardware will SQL Server run on?
You can set a limit yourself, or find out what limit is already in place in your IIS server settings.
http://www.microsoft.com/technet/prodtechnol/WindowsServer2003/Library/IIS/b2b550de-f655-4fb6-9bed-dfc9583b6700.mspx?mfr=true
Does the same connection string, used on two different physical servers hosting different web applications that talk to the same database, draw connections from the same connection pool? Or are pooled connections confined to the application level?
I ask because I inherited a 7-year-old .NET 1.1 web application which is riddled with inline SQL and unclosed, undisposed SqlConnection and DataReader objects. Recently I was tasked with writing a small web app, hosted on another server, that talks to the same database and therefore uses the same connection string. I created a LINQ object to read and write the one table required by the app. Now the original .NET 1.1 app is throwing exceptions like:
"Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached."
Maybe these are unrelated, but I wanted to get your opinions to make sure I cover all my bases.
Thanks!
There is no way connections can be pooled between two separate machines. However, your SQL Server will have a limit on total connections.
This error most likely occurs because the application is not returning connections to the connection pool, which happens when connections are not disposed of correctly. That can be due to poor code (does it use a using block, or a try/catch/finally?), or because a SqlDataReader left open keeps its connection open after the code that executed the SQL has exited.
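As a sketch of the points above (the query and connection string are invented for illustration), wrapping both the connection and the reader in using blocks ensures the connection goes back to the pool even if an exception is thrown mid-read:

```csharp
using System;
using System.Data.SqlClient;

static void PrintCustomerNames(string connStr)
{
    using (var conn = new SqlConnection(connStr))
    using (var cmd = new SqlCommand("select Name from Customers", conn))
    {
        conn.Open();
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                Console.WriteLine(reader.GetString(0));
        } // reader disposed here
    }     // connection closed and returned to the pool here
}
```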
Connection pools live inside your application's worker process (your app pool), so it shouldn't be possible for a separate machine to steal from another box's pool. Have a look here for some info on the connection pool. I'd also recommend enabling the performance counters (see the bottom of that article) to see what's going on in there a bit more.
You might also want to check the maximum number of connections on SQL Server. In Management Studio:
Right click on the Server name --> Properties --> Connections
look for "Maximum number of concurrent connections (0 = unlimited)"
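If you prefer T-SQL, the same setting can be read with sp_configure ('user connections' is an advanced option; 0 means SQL Server manages it automatically, up to the hard limit of 32,767 user connections):

```sql
exec sp_configure 'show advanced options', 1;
reconfigure;
exec sp_configure 'user connections';
```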
What is the best way to handle connection pooling with Oracle 11g and ASP.NET? I'm having issues where Oracle refuses to open any new connections for the web app after a while.
This causes requests to time out and queue up.
EDIT:
Is there anything I need to do in Oracle to fine-tune this?
Since you didn't mention your Oracle config, it's hard to suggest a first course of action, so you need to check how many sessions you have:
SELECT username, count(1) FROM v$session GROUP BY username;
Oracle's maximum is controlled by the PROCESSES instance parameter. The default may be something like 150. You could try bumping it to 300 or so for an OLTP web app; however, if you do have a leak, that will only delay the inevitable. But check that PROCESSES is at least as large as the Max Pool Size setting in your Oracle ADO connection string. The default for 11g ODP.NET is 100, I think.
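To check the current PROCESSES value on your instance (requires access to v$parameter):

```sql
SELECT name, value FROM v$parameter WHERE name = 'processes';
```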
Closing the connections is all you need to do; the framework should handle all of the pooling.
Querying v$session will show all outstanding sessions.
How many connections do you have, and how quickly are you creating and disconnecting them?
Shared server is one mechanism for letting multiple end clients share a limited number of connections.