SqlServer Timeout in ASP.NET

I have a stored procedure that takes about 20 seconds to run in the SQL Server environment, but sometimes when I run it from my ASP.NET page I get a SQL Server timeout exception.
I even set CommandTimeout and ConnectionTimeout to 60, but I still get the exception.
I would appreciate any help.

Some other operation might be locking the table. Set the timeout to a higher value and check.
While the procedure is running, execute the sp_lock and sp_who2 system procedures to look for any locking.
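For illustration, something along these lines will show who is blocking whom while the procedure runs (a rough sketch; the connection string is a placeholder, and SPID/BlkBy are the column names sp_who2 returns):
// requires: using System; using System.Data; using System.Data.SqlClient;
// Rough sketch: list sessions that sp_who2 reports as blocked (non-empty BlkBy).
string connectionString = "Server=.;Database=master;Integrated Security=true";
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlDataAdapter da = new SqlDataAdapter("EXEC sp_who2", conn))
{
    DataTable sessions = new DataTable();
    da.Fill(sessions); // the adapter opens and closes the connection itself

    foreach (DataRow row in sessions.Rows)
    {
        string blockedBy = Convert.ToString(row["BlkBy"]).Trim();
        if (blockedBy.Length > 0 && blockedBy != ".")
        {
            Console.WriteLine("SPID {0} is blocked by SPID {1}", row["SPID"], blockedBy);
        }
    }
}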

You can try
cmd.CommandTimeout = 0;
if you are executing a query that takes a long time (0 means wait indefinitely).
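Note that the connection timeout and the command timeout are two different settings: "Connect Timeout" in the connection string only covers opening the connection, while CommandTimeout covers the running query. A minimal sketch (server and database names are placeholders):
// requires: using System.Data; using System.Data.SqlClient;
// "Connect Timeout" only limits how long opening the connection may take.
string connectionString =
    "Server=myServer;Database=myDb;Integrated Security=true;Connect Timeout=30";

using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("MyReport", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // CommandTimeout limits how long the command itself may run (in seconds).
    // 0 means wait indefinitely - useful as a diagnostic step, not as a fix.
    cmd.CommandTimeout = 120;

    conn.Open();
    cmd.ExecuteNonQuery();
}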

1) Have you tried something like this?
SqlCommand cmd = new SqlCommand("MyReport", conn);
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandTimeout = 3660; // or set it to zero (0)
2) Assuming your database server and the point of execution are different machines, is your internet/intranet connectivity fine?
3) Check the VPN connection (if one is used).

Execute the query from SSMS and save the execution plan. Then run the application, use SQL Server Profiler to capture the trace, and save the execution plan from Profiler as mentioned in this link.
Compare the two execution plans to find the actual difference in execution.
Check for parameter sniffing. If you still have the issue, make sure the database statistics are up to date; sometimes that is the cause. After that, drop and recreate the procedure.
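If parameter sniffing looks likely, one quick test is to run the procedure with a fresh plan. A rough sketch, reusing the MyReport name from above (the parameter and connection string are placeholders):
// requires: using System; using System.Data; using System.Data.SqlClient;
// Forcing a one-off fresh plan helps rule parameter sniffing in or out.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("EXEC MyReport @FromDate WITH RECOMPILE", conn))
{
    cmd.CommandType = CommandType.Text;
    cmd.Parameters.Add("@FromDate", SqlDbType.DateTime).Value = DateTime.Today;
    cmd.CommandTimeout = 120;

    conn.Open();
    cmd.ExecuteNonQuery();
}
// Alternatively, flag the procedure for recompilation once on the server:
//   EXEC sp_recompile 'MyReport';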

I think the problem is in the parameters being sent from your application to the stored procedure.
Try it again, but this time use SQL Server Profiler to trace the query execution.
You can take the TextData column value from SQL Server Profiler and run the actually executed query again to find the real problem.
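If you cannot run Profiler against that server, a rough alternative is to log the command text and parameter values from the application just before execution and replay that call in Management Studio. A small sketch (the helper name is made up):
// requires: using System; using System.Data.SqlClient;
// Hypothetical helper: dump the command text and parameter values right before
// executing, so the same call can be replayed in Management Studio.
static void LogCommand(SqlCommand cmd)
{
    Console.WriteLine("Executing: " + cmd.CommandText);
    foreach (SqlParameter p in cmd.Parameters)
    {
        Console.WriteLine("  {0} ({1}) = {2}", p.ParameterName, p.SqlDbType, p.Value);
    }
}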

Related

Why am I getting a timeout error on my live site

I am facing a problem with my live site. The issue is a SQL timeout error.
I have tried the steps below to solve it, but with no luck.
Steps taken:
Increased the SqlCommand timeout (tried 0 and 240)
Increased the connection timeout (tried 0)
Used raw SQL in code instead to fetch the data from SQL Server
Kindly share any suggestions you have about this issue.
Thanks
It is really hard to address your issue with so little information provided.
Generally, I would recommend executing your query in SQL Server Management Studio and seeing what happens.
It could be either a really long-running query or a locking issue in the database.
Also be aware that if you host your site on IIS, the IIS/ASP.NET request timeout applies in addition to the SQL Server timeout.
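For example, the request timeout can be raised independently of the SQL timeouts; a tiny sketch (value in seconds):
// requires: using System.Web;
// The ASP.NET request timeout is separate from SqlCommand.CommandTimeout.
// The web.config equivalent is <httpRuntime executionTimeout="300" />, which is
// only enforced when <compilation debug="false" />.
HttpContext.Current.Server.ScriptTimeout = 300;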
Example:
Table 1, Table 2, Table 3
Issue scenario:
One request reads the values of those three tables while a second request is writing to one of them at the same time. Because the writing request has not completed, the reading request waits and you get the timeout error.
Solution:
Use an appropriate transaction isolation level so that concurrent reads, writes and updates do not block each other (a sketch follows below).
Thanks.
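As a concrete illustration of that suggestion, SNAPSHOT isolation lets readers see the last committed version instead of waiting for a writer. A rough sketch (assumes snapshot isolation has been enabled on the database; names are placeholders):
// requires: using System.Data; using System.Data.SqlClient;
// Assumes the database was prepared once with:
//   ALTER DATABASE MyDb SET ALLOW_SNAPSHOT_ISOLATION ON;
using (SqlConnection conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (SqlTransaction tx = conn.BeginTransaction(IsolationLevel.Snapshot))
    {
        using (SqlCommand cmd = new SqlCommand("SELECT * FROM Table1", conn, tx))
        using (SqlDataReader reader = cmd.ExecuteReader())
        {
            while (reader.Read())
            {
                // readers see the last committed version; they are not blocked by writers
            }
        }
        tx.Commit();
    }
}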

SQL error timeout with transaction

I have an ASP.NET application that imports data from a CSV file and stores it in a SQL Server database table. Basically, the import process consists of:
Importing the raw CSV data into a corresponding SQL table (with the same columns)
"Merging" the data into the DB with some SQL statements (INSERTs and UPDATEs)
The whole import procedure is wrapped in a transaction.
using (SqlConnection c = new SqlConnection(cSqlHelper.GetConnectionString()))
{
    c.Open();
    SqlTransaction trans = c.BeginTransaction();
    SqlCommand cmd = new SqlCommand("DELETE FROM T_TempCsvImport", c, trans);
    cmd.ExecuteNonQuery();
    // Other import SQL ...
    trans.Commit();
}
Trying this import procedure from a virtual machine (everything is local), I got an error
[SqlException (0x80131904): Timeout. The timeout period elapsed prior to completion of the operation or the server is not responding.
Trying the same without the transaction works fine.
Some things I tried:
Executing the same queries from SQL Server Management Studio: all of them run quite fast (around 500 ms)
Executing from my development machine: works fine
Increasing the command timeout: I get the error anyway. I also tried setting CommandTimeout to 0 (infinite), and the procedure seems to run "forever" (I eventually hit a server timeout, which I set to 10 minutes)
So, the final question is: why is the SQL transaction creating such problems? Why does it work without the transaction?
After several tests, I found out that the problem is... not enough memory!
What I found out is that my situation is exactly the same as in this answer:
Help troubleshooting SqlException: Timeout expired on connection, in a non-load situation
I have both IIS and SQL Server on my local machine, with the test running on a virtual machine. This virtual machine was using 2 GB of RAM, that is, 50% of the total RAM of my PC. Reducing the RAM available to the virtual machine to 512 MB fixed the problem.
Furthermore, I noticed that using a transaction or not has exactly the same result when the system is working, so my first guess was wrong as well.

Oracle ExecuteNonQuery freezes in ASP.NET

I am trying to run a non-query using an Oracle connection in ASP.NET with C# on CLR 4.5. Here is my code:
string connectionString = ConfigurationManager.ConnectionStrings["OracleConnectionString1"].ConnectionString;
OracleConnection conn = new OracleConnection(connectionString);
conn.Open();
OracleCommand cmd = new OracleCommand();
cmd.Connection = conn;
cmd.CommandText = "update SALES_ADVENTUREWORKS2012.SALESORDERDETAIL set UNITPRICEDISCOUNT=0 where ROWGUID='4A399178-C0A0-447E-9973-6AB903B4AECD'";
cmd.CommandType = CommandType.Text;
cmd.CommandTimeout = QUERY_TIMEOUT;
int row_affected = cmd.ExecuteNonQuery();
HttpContext.Current.Response.Write("Rows affected:" + row_affected + "<br/>");
conn.Close();
When I run the query in an Oracle development tool, it works fine.
When I use the ASP.NET code above, it freezes while performing the query. It freezes forever, even though I set a 5-second timeout.
I've tried using both the managed and unmanaged Oracle libraries; both behave the same.
Note that fill and scalar queries work perfectly fine, so there is nothing wrong with my connection string. Also, the fact that the Oracle development tool can perform this update query proves that this is not a permission problem.
Any ideas?
Most likely your query is waiting to get access to the record. You have probably modified that row in the "Oracle development tool" and have not committed or rolled back that transaction.
Just commit/rollback in your tool or close the open session.
You can check for open transactions in the v$transaction view.
More on automatic locks in Oracle:
http://docs.oracle.com/cd/E11882_01/server.112/e41084/ap_locks001.htm
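A quick check for an open transaction, either from your tool or from code, is to count the rows in v$transaction. A rough sketch (assumes the account can select from that view; the connection string is your own):
// requires: using System; using System.Web; using Oracle.ManagedDataAccess.Client;
// (or Oracle.DataAccess.Client for the unmanaged provider)
using (OracleConnection conn = new OracleConnection(connectionString))
using (OracleCommand cmd = new OracleCommand("SELECT COUNT(*) FROM v$transaction", conn))
{
    conn.Open();
    decimal openTransactions = Convert.ToDecimal(cmd.ExecuteScalar());
    HttpContext.Current.Response.Write("Open transactions: " + openTransactions + "<br/>");
}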
Are you certain you are using the 4.5 library? The 3.5 documentation states that the CommandTimeout property has no effect.
The 4.5 documentation suggests it should work, but the Remarks section doesn't mention the change, which warrants suspicion.
Otherwise, the code you posted doesn't seem to show where you actually set the value of QUERY_TIMEOUT to 5 seconds. If QUERY_TIMEOUT has a value of zero, then any other provider (SQLCommand, for example) would wait indefinitely. As vav suggested, locks from other sources could cause an indefinite wait.

SQLite: SQLITE_BUSY on ATTACHed database and parallel read-only connection to the attached database

Let me try to explain the problem in general terms.
We are using SQLite 3.7.11 via the System.Data.SQLite wrapper for .NET CF, version 1.0.80.
We have two database files:
master_data.db
inventory.db
We establish a read-only connection to master_data.db to display some information to the user.
Data Source=master_data.db;Version=3;Read Only=true;Journal Mode=OFF;Synchronous=OFF;
We establish a writable connection to inventory.db to update/insert inventory information depending on some information from master_data.db
Data Source=inventory.db;Version=3;Journal Mode=DELETE;Synchronous=OFF;
To allow consistency checks in update/insert statements, we attach the master_data.db to this connection.
ATTACH 'master_data.db' AS md_db
We start a transaction at inventory.db
SQLiteTransaction tx = connection.BeginTransaction();
We update a simple table in inventory.db, without any interaction with master_data.db.
using (IDbCommand cmd = connection.CreateCommand())
{
    cmd.CommandText = @"UPDATE header_info SET count_time = @countTime";
    SQLiteParameter param = new SQLiteParameter("@countTime",
        DateTime.Now.ToUniversalTime());
    cmd.Parameters.Add(param);
    return cmd.ExecuteNonQuery();
}
We commit the changes, and it hangs until the timeout occurs and SQLITE_BUSY is raised.
tx.Commit(); // BAAM! due to SQLITE_BUSY
We do not understand what's wrong here. The established read-only connection to master_data.db should not be able to lock the whole database, even though the ATTACH command creates what is effectively a second, and the only writable, connection to it; that ATTACH was executed by the one and only writable inventory.db connection. We are sure there is no second connection to inventory.db.
[EDIT]
When the error occurs, no other transaction on master_data.db is open. The connection is open but not in use.
[/EDIT]
Could this issue, which we are also facing, be part of the problem? SQLite: Multiple Connections to one file - the one and only writable is not persisted
Thanks for your help.
Regards,
Schibbl
When you have attached databases, a transaction always covers all those databases.
So when you want to write to a database, you have to ensure that no other connection has any open transaction on the main database or on any of the attached databases.
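In other words, before calling tx.Commit() on the inventory.db connection, make sure the read-only master_data.db connection has no statement or transaction still in flight, because the commit also has to lock the ATTACHed md_db. A rough sketch (masterDataConnection stands for the read-only connection from the question):
// Sketch only: masterDataConnection is the read-only master_data.db connection.
using (SQLiteCommand readCmd = masterDataConnection.CreateCommand())
{
    readCmd.CommandText = "SELECT name FROM sqlite_master LIMIT 1;";
    using (SQLiteDataReader reader = readCmd.ExecuteReader())
    {
        while (reader.Read())
        {
            // consume the result completely
        }
    } // disposing the reader ends the implicit read transaction on master_data.db
}

// Only now commit on the inventory.db connection (which has md_db attached).
tx.Commit();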

Running SP using exec in ado.net [duplicate]

Is there any benefit to explicitly using the StoredProcedure CommandType as opposed to just using a Text Command? In other words, is
cmd = new SqlCommand("EXEC StoredProc(#p1, #p2)");
cmd.CommandType = CommandType.Text;
cmd.Parameters.Add("#p1", 1);
cmd.Parameters.Add("#p2", 2);
any worse than
cmd = new SqlCommand("StoredProc");
cmd.CommandType = CommandType.StoredProcedure;
cmd.Parameters.Add("#p1", 1);
cmd.Parameters.Add("#p2", 2);
EDIT: Fixed bad copy paste job (again). Also, the whole point of the question is for a data access class. I'd much rather be able to pass the stored proc name and parameters in one line as opposed to extra lines for each parameter.
One difference is how message pumping happens.
Where I used to work we had a number of batch processes that ran over night. Many of them simply involved running a stored procedure. We used to schedule these using sql server jobs, but moved away from it to instead call the procedures from a .Net program. This allowed us to keep all our scheduled tasks in one place, even the ones that had nothing to do with Sql Server.
It also allowed us to build better logging functionality into the .Net program that calls the procedures, so that the logging from all of the overnight processes was consistent. The stored procedures would use the sql print and raiserror functions, and the .Net program will receive and log those. What we learned was that CommandType.StoredProcedure would always buffer these messages into batches of about 50. The .Net code wouldn't see any log events until the procedure finished or flushed the buffer, no matter what options you set on the connection or what you did in your sql. CommandType.Text fixed this for us.
As a side issue, I'd use explicit types with your query parameters. Letting .Net try to infer your parameter types can cause issues in some situations.
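A rough sketch of that combination (procedure and parameter names are placeholders): CommandType.Text with explicitly typed parameters, plus a SqlConnection.InfoMessage handler so the print/raiserror output can be logged as it is received.
// requires: using System; using System.Data; using System.Data.SqlClient;
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("EXEC StoredProc @p1, @p2", conn))
{
    cmd.CommandType = CommandType.Text;

    // Explicit types instead of letting ADO.NET infer them.
    cmd.Parameters.Add("@p1", SqlDbType.Int).Value = 1;
    cmd.Parameters.Add("@p2", SqlDbType.Int).Value = 2;

    // Receives print/raiserror output from the server so it can be logged.
    conn.InfoMessage += (sender, e) => Console.WriteLine("[SQL] " + e.Message);

    conn.Open();
    cmd.ExecuteNonQuery();
}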
It's cleaner.
You're calling a stored procedure, why not just use the CommandType.StoredProcedure?
