MySQL "max_execution_time" defaults to 30000ms when using the ODBC connector in Classic ASP, and cannot be changed - asp-classic

This is an issue that has been bugging me for several months now, and one which I have yet to find a solution for.
The default max_execution_time when using the MySQL ODBC connector (8.0) in a Classic ASP application is set to 30000ms (30 seconds), and I can't figure out how to increase it.
I have a large table (400,000+ rows) and I'm using the UPDATE command to perform various calculations several times a day. Usually this takes less than 30 seconds, but as the table continues to grow I'm seeing more and more [MySQL][ODBC 8.0(w) Driver][mysqld-8.0.15]Query execution was interrupted, maximum statement execution time exceeded errors being logged.
I keep coming across the same answers when searching for a solution:
Set max_execution_time = x in my.ini
Or execute SET [GLOBAL]/[SESSION] max_execution_time = x; before running a SQL command.
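For reference, with an example value of 120000ms those two suggestions look like this (the my.ini variant is simply max_execution_time = 120000 under [mysqld]):
SET GLOBAL max_execution_time = 120000;   -- affects new sessions
SET SESSION max_execution_time = 120000;  -- affects only the current connection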
These solutions work fine when you use a MySQL client (such as MySQL Workbench), but they make no difference when using the ODBC connector within a Classic ASP application.
The following code will always output max_execution_time: 30000
Dim dbConn, dbRS
Set dbConn = Server.CreateObject("ADODB.Connection") : dbConn.Open("DSN=my_system_dsn")
Set dbRS = dbConn.Execute("SHOW VARIABLES WHERE Variable_name = 'max_execution_time';")
Response.Write(dbRS("Variable_name") & ": " & dbRS("Value"))
dbRS.close() : Set dbRS = nothing
dbConn.close() : Set dbConn = nothing
Another solution that I've come across is to use:
SELECT /*+ MAX_EXECUTION_TIME(x) */ * FROM TABLE...
But this only works for read-only SELECT statements, and not UPDATE commands.
It doesn't seem to matter what I change, whether it's in the MySQL ODBC Data Source Configuration, changes to the connection string, or the MySQL ini file, the max_execution_time remains at 30000ms when using the ODBC connector to access MySQL within Classic ASP.
I did find a bug report and discussion on the MySQL website from 2016 where people were reporting the same issue with the ODBC 5.x connector when used with Classic ASP, but nobody was able to offer a solution.
I'm just wondering if anybody else has come across this issue and was able to find a way of increasing the max_execution_time value?
The solution (thanks to @Shadow)
Dim dbConn, dbRS
Set dbConn = Server.CreateObject("ADODB.Connection")
dbConn.Open("DSN=my_system_dsn")
dbConn.CommandTimeout = 120 ' seconds; with the ODBC driver this comes through as max_execution_time = 120000ms
Set dbRS = dbConn.Execute("SHOW VARIABLES WHERE Variable_name = 'max_execution_time';")
Response.Write(dbRS("Variable_name") & ": " & dbRS("Value"))
dbRS.close() : Set dbRS = nothing
dbConn.close() : Set dbConn = nothing
Output: max_execution_time: 120000

Increase the CommandTimeout property of the ADO Connection or Command object before executing the query. This property configures how long the driver waits for a query to execute.
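For completeness, a rough sketch of the Command-object variant of the same fix; the UPDATE statement and table name here are placeholders, and note that the Command object's CommandTimeout is independent of the Connection's value:
Dim dbConn, dbCmd
Set dbConn = Server.CreateObject("ADODB.Connection")
dbConn.Open("DSN=my_system_dsn")
Set dbCmd = Server.CreateObject("ADODB.Command")
Set dbCmd.ActiveConnection = dbConn
dbCmd.CommandTimeout = 120 ' seconds, not milliseconds
dbCmd.CommandText = "UPDATE my_table SET total = total + 1" ' placeholder statement
dbCmd.Execute
Set dbCmd = Nothing
dbConn.Close() : Set dbConn = Nothing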

Related

How to achieve an exclusive connection with FireDAC SQLite?

When FDConnection is using the SQLite Driver it has a LockingMode property that is set to Exclusive by default. However, this does not seem to work as expected.
When running the code below, no error occurs when opening the second connection:
FDConnection1.Params.Database := DB_PATH;
FDConnection1.Open();
FDQuery1.SQL.Text := 'update admin set last_write = 2';
FDQuery1.ExecSQL;
FDConnection2.Params.Database := DB_PATH;
FDConnection2.Open();
Specifically setting the SQLite pragma for Exclusive locking mode also does not seem to work:
FDConnection1.Params.Database := DB_PATH;
FDConnection1.Open();
FDQuery1.SQL.Text := 'PRAGMA locking_mode = EXCLUSIVE';
FDQuery1.ExecSQL;
FDQuery1.SQL.Text := 'update admin set last_write = 2';
FDQuery1.ExecSQL;
FDConnection2.Params.Database := DB_PATH;
FDConnection2.Open();
Again, no error on opening the second connection.
How does one effect an Exclusive locking mode when opening a SQLite database? Why does setting the PRAGMA manually not work?
EDIT
After further testing, I see that opening a second connection with a different component set (e.g. UniDAC or ZeosLib) does in fact result in an error.
However, no error occurs when opening a second FDConnection, or even writing through that connection. It seems like FireDAC connections are in some way shared no matter what...
I think you are misunderstanding the meaning of the EXCLUSIVE lock.
From the SQLite 3 documentation:
An EXCLUSIVE lock is needed in order to write to the database file.
Only one EXCLUSIVE lock is allowed on the file and no other locks of
any kind are allowed to coexist with an EXCLUSIVE lock. In order to
maximize concurrency, SQLite works to minimize the amount of time that
EXCLUSIVE locks are held.
This lock is requested only when trying to write to the database file (see: 5.0 Writing to a database file).
To confirm this I ran a simple test with SQLiteStudio and a small Delphi application: I instruct SQLiteStudio to add 1 million records and, while that is running, try to add one record with the Delphi app. I always get a FireDAC error: Database is locked.

Oracle connection pooling in ASP.net

I have an old intranet website written in VB.NET, using Oracle.ManagedDataAccess and mostly doing read operations against an Oracle 11g database.
My db connection code is as follows.
Public Shared Function MyDBConnection(ByVal command_text As String, ByVal connstring As String, ByVal ParamArray parameters As OracleParameter()) As DataTable
    Dim OraCommand As New OracleCommand
    Dim tmp As New DataTable
    Using ORAconnection = New OracleConnection(ConfigurationManager.ConnectionStrings(connstring).ConnectionString)
        OraCommand.CommandText = command_text
        If parameters IsNot Nothing AndAlso parameters.Length > 0 Then
            For Each p In parameters
                OraCommand.Parameters.Add(p.ParameterName, p.OracleDbType, p.Size).Value = p.Value
            Next p
        End If
        OraCommand.Connection = ORAconnection
        Try
            ORAconnection.Open()
            tmp.Load(OraCommand.ExecuteReader())
        Catch ex As OracleException
        End Try
    End Using
    Return tmp
End Function
And my Oracle connection string is like this:
<add name="ConnectionString" connectionString="User Id=userid;Password=userpadd;Data Source=servername:port/port_dp"/>
I was testing whether my connections to the database were closing properly, but it looks like connections on the database stayed open after being closed in my code. Eventually they close long after my query completed: two minutes, ten minutes, or even hours later.
Is this connection pooling at work, or is there something wrong with my code?
After reading about Oracle pooling, it looks like the application should be reusing the same opened connection on the DB, but in my case it looks like it is opening new connections anyway.
So my question is, should I disable pooling on my connection string to make sure all connections open/close and not have connections lingering on the DB?
No, you should not disable connection pooling. In .NET, connection pooling is managed by a mechanism outside of your reach, and you should proceed as if it were not there: open and close your connections as you normally would (i.e. at the beginning and end of every set of operations that you wish to enroll in a transaction, or every set of operations that defines a good "unit of work", such as running a report or updating a table).
Opening and closing connections in your code simply leases them from and returns them to the pool. .NET manages the rest, maintaining a cache of open connections to the database.
Yes, there's pooling. The number of connections might have to do with the default values of min and max pool size. Also, OracleCommand has a Dispose method that you are not calling.
Added: I see your empty Catch statement; you don't need it when using "Using" (or ever). Dispose will still be called if there's an exception.
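If it helps, a minimal sketch of the same helper with the command and reader disposed as well; the pooling keywords mentioned in the comment (Min Pool Size, Max Pool Size) are optional ODP.NET connection string settings, not something your current string requires:
' The connection string can optionally tune the pool, e.g.:
' "User Id=...;Password=...;Data Source=...;Min Pool Size=1;Max Pool Size=20"
Public Shared Function MyDBConnection(ByVal command_text As String, ByVal connstring As String, ByVal ParamArray parameters As OracleParameter()) As DataTable
    Dim tmp As New DataTable
    Using ORAconnection As New OracleConnection(ConfigurationManager.ConnectionStrings(connstring).ConnectionString)
        Using OraCommand As New OracleCommand(command_text, ORAconnection)
            If parameters IsNot Nothing Then
                For Each p In parameters
                    OraCommand.Parameters.Add(p.ParameterName, p.OracleDbType, p.Size).Value = p.Value
                Next p
            End If
            ORAconnection.Open() ' leases a connection from the pool
            Using reader = OraCommand.ExecuteReader()
                tmp.Load(reader)
            End Using
        End Using
    End Using ' returns the connection to the pool; the server-side session stays open
    Return tmp
End Function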

oracle ExecuteNonQuery freezes on ASP.Net

I am trying to run a non-query using an Oracle connection in ASP.NET C# with CLR 4.5. Here is my code:
string connectionString = ConfigurationManager.ConnectionStrings["OracleConnectionString1"].ConnectionString;
OracleConnection conn = new OracleConnection(connectionString);
conn.Open();
OracleCommand cmd = new OracleCommand();
cmd.Connection = conn;
cmd.CommandText = "update SALES_ADVENTUREWORKS2012.SALESORDERDETAIL set UNITPRICEDISCOUNT=0 where ROWGUID='4A399178-C0A0-447E-9973-6AB903B4AECD'";
cmd.CommandType = CommandType.Text;
cmd.CommandTimeout = QUERY_TIMEOUT;
int row_affected = cmd.ExecuteNonQuery();
HttpContext.Current.Response.Write("Rows affected:" + row_affected + "<br/>");
conn.Close();
When I run the query in an Oracle development tool, it works fine.
When I use the ASP code above, it freezes when performing the query. It freezes forever, even though I used a 5 second timeout.
I've tried using both the managed and unmanaged Oracle libraries; both behave the same.
Note that Fill and scalar queries work perfectly fine, so there is nothing wrong with my connection string. Also, the fact that the Oracle development tool can perform this update query shows that this is not a permissions problem.
Any ideas?
Most likely your query is waiting to get access to the record. You have probably modified that row in the "oracle development tool" and have not committed or rolled back that transaction.
Just commit/rollback in your tool or close the open session.
You can check for open transactions in the v$transaction view.
More on automatic locks in Oracle:
http://docs.oracle.com/cd/E11882_01/server.112/e41084/ap_locks001.htm
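If you want to see the blocking transaction, something along these lines (run as a user with access to the v$ views) should show it:
-- One row per active transaction; an uncommitted UPDATE from another session shows up here
SELECT s.sid, s.serial#, s.username, s.status, t.start_time
  FROM v$transaction t
  JOIN v$session s ON s.taddr = t.addr;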
Are you certain you are using the 4.5 library? The 3.5 documentation states that the CommandTimeout property has no effect.
The 4.5 documentation suggests it should work, but the Remarks section doesn't mention the change, which warrants suspicion.
Otherwise, the code you posted doesn't seem to show where you actually set the value of QUERY_TIMEOUT to 5 seconds. If QUERY_TIMEOUT has a value of zero, then any other provider (SQLCommand, for example) would wait indefinitely. As vav suggested, locks from other sources could cause an indefinite wait.

SqlServer Timeout in ASP.NET

I have a stored procedure which takes 20 seconds in the SQL Server environment, but sometimes when I run it from my ASP.NET page I get a SQL Server timeout exception.
I even set CommandTimeout and ConnectionTimeout to 60, but I still get the exception.
I would appreciate any help.
Some other operation might be locking the table. Set the timeout to a higher value and check.
While the procedure is running, execute the sp_lock and sp_who2 system procedures to check for locking.
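For example, in a second SSMS window while the procedure is running:
EXEC sp_who2;  -- the BlkBy column shows which session is blocking yours
EXEC sp_lock;  -- lists the locks currently held or requested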
You can try
cmd.CommandTimeout = 0;
if you are executing a query that takes a long time.
1) Have you tried something like this?
SqlCommand cmd = new SqlCommand("MyReport", conn);
cmd.CommandType = CommandType.StoredProcedure;
cmd.CommandTimeout = 3660; // or set it to zero (0)
2) and this???
3) Assuming your DB server and point of execution are different, is your internet/intranet connectivity fine?
4) Check the VPN connection (if used).
Execute the query from SSMS and save the execution plan. Then run the application, have SQL Server Profiler capture the trace, and save the execution plan from the Profiler as mentioned in this link.
Compare the two execution plans to find the actual difference in execution.
Check for parameter sniffing. If you still have the issue, make sure the DB statistics are updated; sometimes that is the cause. After that, drop and recreate the procedure.
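A couple of quick things to try for the statistics / parameter-sniffing angle (the procedure name below is a placeholder):
EXEC sp_updatestats;                  -- refresh statistics across the database
EXEC sp_recompile 'dbo.MyProcedure';  -- mark the procedure for recompilation on its next run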
I think the problem is in the parameters sent from your application to the stored procedure.
Try it again, but this time use SQL Server Profiler to trace your query execution.
You can use the TextData column value in SQL Server Profiler and run the actually executed query again to find the real problem.

Provider error '80004005' Unspecified error

I know this question has been asked before, but I can't see from the other posts what this could be. I don't know ASP; I have just been uploading images, changing the database, and re-uploading via FTP, but now I have come across an error:
Provider error '80004005'
Unspecified error
/Includes/DB/DatabaseConnect.asp, line 8
<%
Dim espostiDB
Set espostiDB = Server.CreateObject("ADODB.Connection")
'espostiDB.ConnectionString = "DSN=esposti.dsn"
'espostiDB.ConnectionString = "Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" & Server.MapPath("db\esposti1.mdb")
espostiDB.ConnectionString = "DRIVER={Microsoft Access Driver (*.mdb)};DBQ=" & Server.MapPath("db\esposti.mdb") ')e:\inetpub\wwwroot\esposti\esposti.mdb"
'espostiDB.ConnectionString = "DSN=esposti.dsn"
espostiDB.Open
%>
line 8 is espostiDB.open
Database is Access 2000
Help Much appreciated
Jack
This is how I solved this problem, the exact same error: go to Control Panel - Administrative Tools - Internet Information Services. On the right, click View Application Pools and, for both "Classic .NET AppPool" and "DefaultAppPool", set the .NET Framework version to v4.0. If it's set to v2.0 the error above will show up.
Working with your code above, you could use this:
<%
Set espostiDB = Server.CreateObject("ADODB.Connection")
connStr = "Provider=Microsoft.Jet.OLEDB.4.0; Data Source=" & Server.MapPath("~\db\esposti.mdb")
espostiDB.Open connStr
%>
The info here will also give you some pointers on how Server.MapPath works (in case your mdb file is not in the root of your website).
80004005 errors can be numerous things. One thing to check is that the database is not currently in an open state and therefore locked, so every time you open a connection you need to make sure that you close it. An easy way to check this is to look for an Access lock file (.ldb) in the same folder as the database.
Check that you've not inadvertently changed the path so it's now incorrect (a Response.Write(Server.MapPath("db\esposti.mdb")) should print the full path).
It's also worth recycling the IIS app pool; sometimes connections remain open in IIS and the database becomes unresponsive, which can be a sign you're not closing connections properly.
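As a point of reference, a minimal close-everything pattern for the connection code above (the SELECT and table name are placeholders):
<%
Dim espostiDB, rs
Set espostiDB = Server.CreateObject("ADODB.Connection")
espostiDB.Open "DRIVER={Microsoft Access Driver (*.mdb)};DBQ=" & Server.MapPath("db\esposti.mdb")
Set rs = espostiDB.Execute("SELECT * FROM some_table") ' placeholder query
' ... use rs ...
rs.Close : Set rs = Nothing
espostiDB.Close : Set espostiDB = Nothing ' always close so the .ldb lock file is released
%>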
The cause of this conflict looks to be a 32-bit application running on a 64-bit system. Please make sure that the ODBC connection is defined using C:\Windows\SysWOW64\odbcad32.exe.
