This is my first time using MySQL with ASP.NET.
Unlike MS SQL Server, which I use quite often, I've noticed that connecting to the database with MySqlConnection takes ages (I mean a second or two):
MySql.Data.MySqlClient.MySqlConnection connection = new MySqlConnection(DBConnectionString);
Therefore I would like to know how to implement a connection pool, or whatever structure is recommended for keeping one connection object (MySqlConnection) available across the application.
Is there a common practice for doing so, or any other recommendations?
Here's the code I'm using - maybe I'm doing something wrong here?
MySql.Data.MySqlClient.MySqlConnection connection = new MySqlConnection(DBConnectionString);
MySqlDataAdapter adapter = new MySqlDataAdapter();
if (connection.State != ConnectionState.Open)
{
    try
    {
        connection.Open();
    }
    catch (MySqlException)
    {
        throw;   // rethrow; "throw ex" would reset the stack trace
    }
}
MySqlCommand cmd = new MySqlCommand("SELECT this FROM that", connection);
DataSet ds = new DataSet();
adapter.SelectCommand = cmd;
adapter.Fill(ds);
cmd.Connection.Close();
According to the documentation, connection pooling is on by default. Further, you're creating the instance with the connection string, which is good, because it allows the connector to leverage the pool immediately. So the fact that it's taking a second or two to create those connections is almost certainly unrelated to connection pooling and more related to the hardware you're giving MySQL in the environment you're working in.
The term hardware is really broad here, because you could be dealing with anything from network to disk and memory related issues.
Do read the documentation - it shows you how to adjust the connection pooling, so that may help you. I say that because your question doesn't give us much information about exactly how you're using this server or these connections.
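For reference, Connector/NET controls pooling through connection string options. A hedged sketch (the option names are the documented Connector/NET ones; the server/credential values, pool sizes and lifetime below are placeholders, not recommendations):
// Pooling is on by default; these options only tune it.
// "Min Pool Size", "Max Pool Size" and "Connection Lifetime" are
// Connector/NET connection string options; the numbers are placeholders.
string DBConnectionString =
    "Server=myserver;Database=mydb;Uid=myuser;Pwd=mypassword;" +
    "Pooling=true;Min Pool Size=5;Max Pool Size=50;Connection Lifetime=300;";

using (var connection = new MySql.Data.MySqlClient.MySqlConnection(DBConnectionString))
{
    connection.Open();   // takes an idle physical connection from the pool if one exists
    // ... run commands ...
}   // Dispose/Close returns the physical connection to the pool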
Related
We are migrating a project from .NET Framework to .NET Core. The project works with multiple databases that are on the same SQL Server instance. In the past we used a TransactionScope for any transaction that we wanted to roll back in case of an error.
When the transaction involves multiple databases it gets promoted to a distributed transaction, which is not supported in .NET Core.
The question is: given that all the databases are actually on the same server, if I use 'cross-database queries' as suggested at the very end of this answer, am I protected against that scenario?
Does 'cross-database queries' simply mean running raw SQL commands like this?
using (TransactionScope scope = new TransactionScope())
{
    var connection = new SqlConnection(connectionString);
    connection.Open();
    var SqlComm1 = new SqlCommand("Insert into TableA...", connection);
    SqlComm1.ExecuteNonQuery();
    var SqlComm2 = new SqlCommand("Insert into [DB2].[dbo].[TableB]...", connection);
    SqlComm2.ExecuteNonQuery();
    // ...
    scope.Complete();   // commit; without this the scope rolls back on Dispose
}
If not, can I get a code example of what it actually is?
Lastly, while using 'cross-database queries', can I take advantage of anything from my existing DbContexts - connections, DbSets or anything else - and if so, how?
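For what it's worth, one way to read the linked suggestion is sketched below: keep everything on a single SqlConnection and a local SqlTransaction, and reach the second database through three-part names, so there is only one connection and nothing for System.Transactions to escalate. This is a sketch, not a verified answer; the database/table names come from the snippet above and the column names/values are placeholders.
using System.Data.SqlClient;

// Sketch: one connection, one local transaction, two databases on the same
// server reached via three-part names, so no escalation to a distributed
// transaction is involved. Column names/values are placeholders.
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (SqlTransaction tx = connection.BeginTransaction())
    {
        try
        {
            using (var cmd1 = new SqlCommand("INSERT INTO TableA (SomeColumn) VALUES (@v)", connection, tx))
            {
                cmd1.Parameters.AddWithValue("@v", 1);
                cmd1.ExecuteNonQuery();
            }
            using (var cmd2 = new SqlCommand("INSERT INTO [DB2].[dbo].[TableB] (SomeColumn) VALUES (@v)", connection, tx))
            {
                cmd2.Parameters.AddWithValue("@v", 2);
                cmd2.ExecuteNonQuery();
            }
            tx.Commit();
        }
        catch
        {
            tx.Rollback();
            throw;
        }
    }
}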
I am creating a program in which I am trying to open a new database connection inside an existing database connection using OLEDB in ASP.NET. Is that even possible?
"Connection within connection" is likely not what you have in mind. But there is no issue with having both OleDb and SQLServer connections operating together in one ASP.NET application, you can add these 2 or many more connections to one app.
Answer is yes. Please make sure you close the connections after using by using statement or close() explicitly. More here.
using (OleDbConnection connection1 = new OleDbConnection(connectionString1))
using (OleDbConnection connection2 = new OleDbConnection(connectionString2))
{
connection1.Open();
connection2.Open();
// Do something
}
I am trying to run a non-query command using an Oracle connection in ASP.NET (C#, CLR 4.5). Here is my code:
string connectionString = ConfigurationManager.ConnectionStrings["OracleConnectionString1"].ConnectionString;
OracleConnection conn = new OracleConnection(connectionString);
conn.Open();
OracleCommand cmd = new OracleCommand();
cmd.Connection = conn;
cmd.CommandText = "update SALES_ADVENTUREWORKS2012.SALESORDERDETAIL set UNITPRICEDISCOUNT=0 where ROWGUID='4A399178-C0A0-447E-9973-6AB903B4AECD'";
cmd.CommandType = CommandType.Text;
cmd.CommandTimeout = QUERY_TIMEOUT;
int row_affected = cmd.ExecuteNonQuery();
HttpContext.Current.Response.Write("Rows affected:" + row_affected + "<br/>");
conn.Close();
When I run the query in an Oracle development tool, it works fine.
When I use the ASP.NET code above, it freezes while performing the query. It hangs forever, even though I set a 5-second timeout.
I've tried both the managed and unmanaged Oracle libraries; they behave the same.
Note that Fill and scalar queries work perfectly fine, so there is nothing wrong with my connection string. Also, the fact that the Oracle development tool can perform this update query proves that it is not a permissions problem.
Any ideas?
Most likely your query is waiting to get access to the record. You have probably modified that row in your "Oracle development tool" and have not committed or rolled back that transaction.
Just commit/roll back in your tool, or close the open session.
You can check for open transactions in the v$transaction view.
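For example, from the C# side (a sketch using the same OracleConnection/OracleCommand types as the question; querying v$transaction typically requires appropriate privileges):
// Sketch: count currently open transactions on the instance.
// Requires SELECT access to the v$transaction dynamic performance view.
using (var checkConn = new OracleConnection(connectionString))
using (var checkCmd = new OracleCommand("SELECT COUNT(*) FROM v$transaction", checkConn))
{
    checkConn.Open();
    int openTransactions = Convert.ToInt32(checkCmd.ExecuteScalar());
    HttpContext.Current.Response.Write("Open transactions: " + openTransactions + "<br/>");
}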
More on automatic locks in Oracle:
http://docs.oracle.com/cd/E11882_01/server.112/e41084/ap_locks001.htm
Are you certain you are using the 4.5 library? The 3.5 documentation states that the CommandTimeout property has no effect.
The 4.5 documentation suggests it should work, but the Remarks section doesn't mention the change, which warrants suspicion.
Otherwise, the code you posted doesn't show where you actually set QUERY_TIMEOUT to 5 seconds. If QUERY_TIMEOUT is zero, any other provider (SqlCommand, for example) would wait indefinitely. As vav suggested, locks from other sources could cause an indefinite wait.
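For instance, hard-coding a non-zero timeout for a moment would rule that out (a trivial sketch; the value is in seconds, and 0 means wait indefinitely):
// Temporary sanity check: an explicit, non-zero timeout in seconds,
// instead of the QUERY_TIMEOUT constant whose value isn't shown.
cmd.CommandTimeout = 5;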
I have an ASP.NET GridView connected to my SQL database. When inserting a new record or updating an existing one, I do some server-side checks and then either update/insert a record or do nothing. Right now I have two methods, CheckArtistExists and CheckSongExists, which both use a SqlConnection object, e.g.:
public bool CheckSongExists(string _title, int _artistId)
{
    int cnt = -1;
    using (SqlConnection con = new SqlConnection(CS))
    {
        // check if the song already exists in the DB
        SqlCommand cmd = new SqlCommand("SELECT COUNT(ID) FROM tblSong WHERE Title = @newTitle AND ArtistId = @newArtistId;", con);
        cmd.Parameters.AddWithValue("@newTitle", _title);
        cmd.Parameters.AddWithValue("@newArtistId", _artistId);
        con.Open();
        cnt = (int)cmd.ExecuteScalar();
        // if cnt == 1 the song exists in the DB, if cnt == 0 it doesn't
        if (cnt == 1)
        { return true; }
        else
        { return false; }
    }
}
So for the Update function in the GridView I need to establish up to three SqlConnections: one to check for the artist (if the artist doesn't exist I have to insert a record into tblArtist first),
then one to check whether the song exists (only if the artist exists), and finally, if the song doesn't exist, one to insert the new record.
I know database connections are valuable resources; that's why I put them in a using block. So I'm not quite sure whether it's good style to use three SqlConnection objects to update/insert. Can you please tell me if my code is OK, or whether I should use another approach for this problem?
Thank you.
ADO.NET internally manages the underlying connections to the DBMS in the ADO.NET connection pool:
In practice, most applications use only one or a few different
configurations for connections. This means that during application
execution, many identical connections will be repeatedly opened and
closed. To minimize the cost of opening connections, ADO.NET uses an
optimization technique called connection pooling.
Connection pooling reduces the number of times that new connections
must be opened. The pooler maintains ownership of the physical
connection. It manages connections by keeping alive a set of active
connections for each given connection configuration. Whenever a user
calls Open on a connection, the pooler looks for an available
connection in the pool. If a pooled connection is available, it
returns it to the caller instead of opening a new connection. When the
application calls Close on the connection, the pooler returns it to
the pooled set of active connections instead of closing it. Once the
connection is returned to the pool, it is ready to be reused on the
next Open call.
So there's no reason to avoid creating, opening or closing connections, since the physical connections aren't actually created, opened and closed at all. Open/Close is "only" a flag that tells the connection pool when a connection can be reused. But it's a very important flag, because if the pool assumes a connection is still "in use", a new physical connection to the DBMS must be opened, which is very expensive.
So you gain no performance improvement by "reusing" connection objects - quite the opposite.
Create, open (in the case of connections), use, close and dispose of them where you need them (e.g. in a method).
Use the using statement to dispose of (and, in the case of connections, close) them implicitly.
So yes, it's absolutely fine to use one connection per method, since you aren't opening a new physical connection each time as long as connection pooling is enabled (the default).
A separate question is whether you could improve your approach. You could create a stored procedure that checks existence and updates or inserts accordingly:
Solutions for INSERT OR UPDATE on SQL Server
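Alternatively, staying in C#, the existence check and the insert can be collapsed into a single round trip on one pooled connection. A rough sketch, assuming System.Data.SqlClient, the tblSong/Title/ArtistId names and the CS connection string from the question, and that tblSong needs no other columns on insert:
public void InsertSongIfMissing(string title, int artistId)
{
    // One command, one (pooled) connection: insert only if the song
    // is not already present for this artist.
    const string sql =
        "IF NOT EXISTS (SELECT 1 FROM tblSong WHERE Title = @newTitle AND ArtistId = @newArtistId) " +
        "INSERT INTO tblSong (Title, ArtistId) VALUES (@newTitle, @newArtistId);";

    using (SqlConnection con = new SqlConnection(CS))
    using (SqlCommand cmd = new SqlCommand(sql, con))
    {
        cmd.Parameters.AddWithValue("@newTitle", title);
        cmd.Parameters.AddWithValue("@newArtistId", artistId);
        con.Open();
        cmd.ExecuteNonQuery();   // the connection returns to the pool on Dispose
    }
}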
I'm writing a small utility MVC app, and I need the ability to execute ad-hoc queries against my one-table SQL Compact 4.0 .sdf file for management (WebMatrix isn't working right for development, and it won't be available on the PC this will ultimately run on). Using Entity Framework Code First, everything works fine, but to run an ad-hoc query I figured I'd need to connect the way I would have in the pre-EF days (see below):
cn = new SqlConnection("Data Source=|DataDirectory|LocalScanData.sdf");
SqlCommand cmd = new SqlCommand(query, cn);
if (cn.State != ConnectionState.Open) cn.Open();
if (query.ToUpper().StartsWith("INSERT") || query.ToUpper().StartsWith("UPDATE") || query.ToUpper().StartsWith("DELETE"))
{
TempData["RowsAffected"] = cmd.ExecuteNonQuery();
return RedirectToAction("SQL");
}
else
{
SqlDataAdapter da = new SqlDataAdapter(cmd);
DataTable dt = new DataTable();
da.Fill(dt);
return RedirectToAction("SQL", dt);
}
But when I try that, I get "A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: Named Pipes Provider, error: 40 - Could not open a connection to SQL Server)". So the question is: how can I connect to a SQL CE 4.0 database the old-fashioned way? I've also tried using System.Data.SqlServerCe, but then I get errors that lead me to believe it only works for CE 3.5 databases.
Any help?
It happened to me, too. That's just how the .sdf database works; you can't execute a query against it that way. You have to create a DataSet, a DataAdapter and so on.
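For completeness, a hedged sketch of what that looks like with the SQL CE 4.0 provider (this assumes the 4.0 build of System.Data.SqlServerCe is referenced, e.g. via the Microsoft SQL Server Compact 4.0 package, rather than the 3.5 one; query is the ad-hoc SQL string from the question):
using System;
using System.Data;
using System.Data.SqlServerCe;

// SQL CE has its own provider - SqlCeConnection/SqlCeCommand/SqlCeDataAdapter -
// rather than the SqlClient classes used for full SQL Server.
using (var cn = new SqlCeConnection("Data Source=|DataDirectory|LocalScanData.sdf"))
using (var cmd = new SqlCeCommand(query, cn))
{
    cn.Open();
    if (query.TrimStart().StartsWith("SELECT", StringComparison.OrdinalIgnoreCase))
    {
        var dt = new DataTable();
        using (var da = new SqlCeDataAdapter(cmd))
        {
            da.Fill(dt);
        }
        // use dt ...
    }
    else
    {
        int rowsAffected = cmd.ExecuteNonQuery();
        // use rowsAffected ...
    }
}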