SQLite via JDBC: SQLITE_BUSY when inserting after selecting

I'm getting the error code SQLITE_BUSY when trying to write to a table after selecting from it. The select statement and result set are properly closed before my insert.
If I remove the select part, the insert works fine, and that's what I don't get. According to the documentation, SQLITE_BUSY should mean that a different process or connection (which is definitely not the case here) is blocking the database.
There's no SQLite manager running. jdbcConn is the only connection to the database I have, and there are no threads running in parallel either.
Here's my code:
try {
    if(!jdbcConn.isClosed()) {
        ArrayList<String> variablesToAdd = new ArrayList<String>();
        String sql = "SELECT * FROM VARIABLES WHERE Name = ?";
        try (PreparedStatement stmt = jdbcConn.prepareStatement(sql)) {
            for(InVariable variable : this.variables.values()) {
                stmt.setString(1, variable.getName());
                try(ResultSet rs = stmt.executeQuery()) {
                    if(!rs.next()) {
                        variablesToAdd.add(variable.getName());
                    }
                }
            }
        }
        if(variablesToAdd.size() > 0) {
            String sqlInsert = "INSERT INTO VARIABLES(Name, Var_Value) VALUES(?, '')";
            try(PreparedStatement stmtInsert = jdbcConn.prepareStatement(sqlInsert)) {
                for(String name : variablesToAdd) {
                    stmtInsert.setString(1, name);
                    int affectedRows = stmtInsert.executeUpdate();
                    if(affectedRows == 0) {
                        LogManager.getLogger().error("Error while trying to add missing database variable '" + name + "'.");
                    }
                }
            }
            jdbcConn.commit();
        }
    }
}
catch(Exception e) {
    LogManager.getLogger().error("Error creating potentially missing database variables.", e);
}
This crashes on int affectedRows = stmtInsert.executeUpdate();. If I remove the first block (and manually add a value to the variablesToAdd list), the value inserts into the database just fine.
Am I missing something? Am I not closing the ResultSet and PreparedStatement properly? Maybe I'm blind to my mistake from looking at it for too long.
Edit: Executing the select in a separate thread also does the trick, but that can't be the solution. Am I trying to insert into the database too soon after closing the previous statements?
Edit 2: I came across busy_timeout, which promises to make updates/queries wait for a specified amount of time before returning SQLITE_BUSY. I tried setting the busy timeout like so:
if(jdbcConn.prepareStatement("PRAGMA busy_timeout = 30000").execute()) {
    jdbcConn.commit();
}
The executeUpdate() call still returns immediately with SQLITE_BUSY.
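(For reference: with the Xerial sqlite-jdbc driver, the busy timeout can also be configured when the connection is opened rather than via a PRAGMA afterwards. A minimal sketch, assuming a reasonably recent version of that driver; the class name and database path are placeholders:)
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import org.sqlite.SQLiteConfig;

public class BusyTimeoutExample {
    // Open a connection with the busy timeout set up front, instead of
    // issuing a "PRAGMA busy_timeout" statement after connecting.
    public static Connection open(String dbPath) throws SQLException {
        SQLiteConfig config = new SQLiteConfig();
        config.setBusyTimeout(30000); // wait up to 30 s for a lock before failing with SQLITE_BUSY
        return DriverManager.getConnection("jdbc:sqlite:" + dbPath, config.toProperties());
    }
}
Note that a busy timeout only helps when the lock is eventually released by whoever holds it; it does not fix the underlying problem described below.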

I'm dumb.
I was so thrown off by the fact that removing the select statement worked (still not sure why it did, probably bad timing) that I missed a different thread using the same file.
I made both threads use the same java.sql.Connection and everything works fine now.
Thank you for pushing me in the right direction, @GordThompson. I wasn't aware of the jdbc:sqlite::memory: option, which led to me finding the issue.
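(A minimal sketch of the fix described above, assuming a hypothetical ConnectionHolder class and a placeholder database path: both threads obtain the same java.sql.Connection instead of each opening its own connection to the same file.)
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;

public final class ConnectionHolder {
    private static Connection connection;

    private ConnectionHolder() {
    }

    // Every thread that touches the database asks for the one shared connection,
    // so no second connection is left holding a lock on the file.
    public static synchronized Connection get() throws SQLException {
        if (connection == null || connection.isClosed()) {
            connection = DriverManager.getConnection("jdbc:sqlite:/path/to/database.db");
        }
        return connection;
    }
}
Access to the shared connection still has to be coordinated between the threads (for example by synchronizing on it), since a single connection is not meant for fully concurrent use.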

Related

Firefox addon development: SQLite Database Connection

So I am developing this add-on using the MDN Add-on Builder and need to connect to an SQLite database. The connection gets created fine, and insertion works as long as I am inserting values without binding parameters (that is, through executeSimpleSQL()). As soon as I use the createStatement() method to INSERT values, it does not work. Here's what I have done so far.
let file = FileUtils.getFile("Desk", ["my_db_file_name.sqlite"]);
let mDBConn = Services.storage.openDatabase(file);
mDBConn.executeSimpleSQL("CREATE TEMP TABLE IF NOT EXISTS element (rating VARCHAR(50))");
let stmt = mDBConn.createStatement("INSERT INTO element (rating) VALUES(:value)");
stmt.params.value = 13;
//mDBConn.executeSimpleSQL("INSERT INTO element (rating) VALUES(13)");
var statement = mDBConn.createStatement("SELECT * FROM element WHERE rating = :rat");
statement.params.rat = 13;
try {
    while (statement.step()) {
        let value = statement.row.rating;
        console.log(value);
    }
}
finally {
    statement.reset();
}
Note that the SELECT statement with the bound parameters works fine; it's just the INSERT statement that's problematic.
Any ideas?
You forgot to call execute() on the INSERT statement (stmt.execute()); the statement is prepared and its parameter is bound, but it is never actually run.

C# - Insert Multiple Records at once to AS400

I have a problem like this:
1. I retrieve data from MySQL using C# ASP.NET. -- done --
2. All data from step 1 will be inserted into a table on the AS400. -- I got an error on this step --
The error message says ERROR [42000] [IBM][System i Access ODBC Driver][DB2 for i5/OS]SQL0104 - Token ; was not valid. Valid tokens: <END-OF-STATEMENT>. It's true that I used semicolons to separate the queries from each other, but apparently that's not allowed. I've been Googling but can't find a solution.
My question is: what does <END-OF-STATEMENT> mean in that error message?
Here is my source code.
private static void doInsertDOCADM(MySqlConnection conn)
{
    // Get Temporary table
    String query = "SELECT * FROM TB_T_DOC_TEMPORARY_ADM";
    DataTable dt = CSTDDBUtil.ExecuteQuery(query);
    OdbcConnection as400Con = null;
    as400Con = CSTDDBUtil.GetAS400Connection();
    as400Con.Open();
    if (dt != null && dt.Rows.Count > 0)
    {
        int counter = 1, maxInsertLoop = 50;
        using (OdbcCommand cmd = new OdbcCommand())
        {
            cmd.Connection = as400Con;
            foreach (DataRow dr in dt.Rows)
            {
                cmd.CommandText += "INSERT INTO DCDLIB.WDFDOCQ VALUES " + "(?,?,?,?);";
                cmd.Parameters.Add("1", OdbcType.VarChar).Value = dr["PROD_MONTH"].ToString();
                cmd.Parameters.Add("2", OdbcType.VarChar).Value = dr["NEW_MAIN_DEALER_CD"].ToString();
                cmd.Parameters.Add("3", OdbcType.VarChar).Value = dr["MODEL_SERIES"].ToString();
                cmd.Parameters.Add("4", OdbcType.VarChar).Value = dr["MODEL_CD"].ToString();
                if (counter < maxInsertLoop)
                {
                    counter++;
                }
                else
                {
                    counter = 1;
                    cmd.ExecuteNonQuery();
                    cmd.CommandText = "";
                    cmd.Parameters.Clear();
                }
            }
            if (counter > 1) cmd.ExecuteNonQuery();
        }
    }
}
Note: I used this approach (collect several queries first, then execute them together) to improve the performance of my application.
As Clockwork-Muse pointed out, the problem is that you can only run a single SQL statement in a command. The iSeries server does not handle multiple statements at once.
If your iSeries server is running V6R1 or later, you can use block inserts to insert multiple rows. I'm not sure whether you can do so through the ODBC driver, but since you have Client Access, you should be able to install the iSeries ADO.NET driver. There are not many differences between the ADO.NET iSeries driver and the ODBC one, but with ADO.NET you get access to iSeries-specific features.
With the ADO.NET driver, multiple inserts become a simple matter of:
using (iDB2Connection connection = new iDB2Connection(".... connection string ..."))
{
    // Create a new SQL command
    iDB2Command command =
        new iDB2Command("INSERT INTO MYLIB.MYTABLE VALUES(@COL_1, @COL_2)", connection);
    // Initialize the parameters collection
    command.DeriveParameters();
    // Insert 20 rows of data at once
    for (int i = 0; i < 20; i++)
    {
        // Here, you set your parameters for a single row
        command.Parameters["@COL_1"].Value = i;
        command.Parameters["@COL_2"].Value = i + 1;
        // AddBatch() tells the command you're done preparing a row
        command.AddBatch();
    }
    // The query gets executed
    command.ExecuteNonQuery();
}
There is also some reference code provided by IBM for doing block inserts using VB6 and ODBC, but I'm not sure it can easily be ported to .NET: http://publib.boulder.ibm.com/infocenter/iseries/v5r4/index.jsp?topic=%2Frzaik%2Frzaikextfetch.htm
Hope that helps.
When it says <END-OF-STATEMENT>, it means about what it says: it wants that point to be the end of the executed statement. I don't recall whether the AS/400 allows multiple statements per execution unit at all, but it's clearly not working here, and the driver isn't dealing with it either.
Actually, you have a larger, more fundamental problem; specifically, you're INSERTing a row at a time (usually known as row-by-agonizing-row). DB2 allows a comma-separated list of rows in a VALUES clause (so, INSERT INTO <table_name> VALUES(<row_1_columns>), (<row_2_columns>)) - does the driver you're using allow you to provide arrays (either of the entire row, or per-column)? Otherwise, look into using extract/load utilities for stuff like this - I can guarantee you that this will speed up the process.

SQL always returning same results

I'm trying to do a search on a table in my database that returns the top 50 rows with a first name like the search term passed to the function, but it's always returning the same 50 results.
My SQL looks like this:
Select TOP(50) *
FROM [database].[dbo].[records]
WHERE (A_1STNAME LIKE '" + @searchTerm + "%')
ORDER BY A_RECID
When I run this query in Visual Studio's query window, it works as expected, but when I run it through my ASP.NET application it always returns the same 50 results, and only one of them has a first name close to the search term I passed in.
Here is the page code that runs the function:
protected void Page_Load(object sender, EventArgs e)
{
    _migrated_data data = new _migrated_data();
    DataSet ds = data.Search(Request.QueryString.Get("query"), "A_RECID", 50);
    if (ds.Tables.Count > 0 && ds.Tables[0].Rows.Count > 0)
    {
        rpt_Data.DataSource = ds.Tables[0].DefaultView;
        rpt_Data.DataBind();
    }
}
and here is the search method of _migrated_data:
public DataSet Search(String @pSearchTerm, String @pSortBy, int @pRowCount)
{
    DataSet ds = new DataSet();
    OleDbConnection objOleDBConn;
    OleDbDataAdapter objOleDBDa;
    objOleDBConn = new OleDbConnection(ClearingHouse_OLEDDB);
    objOleDBConn.Open();
    string lSQL = "SELECT TOP(50) * FROM [database].[dbo].[records]";
    lSQL += " WHERE (A_1STNAME LIKE @searchTerm ) ORDER BY @sortBy";
    SqlCommand t = new SqlCommand(lSQL);
    if (pSearchTerm != null && pSearchTerm != "")
    {
        t.Parameters.AddWithValue("@searchTerm", @pSearchTerm + "%");
    }
    if (pSortBy != null && pSortBy != "")
    {
        t.Parameters.AddWithValue("@sortBy", @pSortBy);
    }
    else
    {
        t.Parameters.AddWithValue("@sortBy", "A_RECID");
    }
    objOleDBDa = new OleDbDataAdapter(t.CommandText, objOleDBConn);
    objOleDBDa.SelectCommand.CommandType = CommandType.Text;
    objOleDBDa.Fill(ds);
    objOleDBConn.Close();
    return ds;
}
Using the Locals window to view the final CommandText of t, I get the SQL I posted above.
Any help is greatly appreciated :)
As Rob Rodi said, first of all, to get results beginning with a value you need a % character at the end of the term:
Select TOP(50) *
FROM [database].[dbo].[records]
WHERE (A_1STNAME LIKE '" + @searchTerm + "%')
ORDER BY A_RECID
Second, your searchTerm is probably empty, so it's always grabbing the same results.
Debug your app and watch the variable to see whether it's receiving a value.
You have a SQL injection vulnerability there. It also looks like the %-signs are missing.
The fact that the query works in a query window and not in your app means that the error is not in your query! Did you think about that? You should probably post the code of your ASP.NET app.
It sounds like your parameter isn't being passed correctly to your data layer. The easiest way to find out is to turn on SQL Profiler and check whether it's being passed. If it is, your SQL is to blame; if it's not, it's your application. You can then use the debugger to work your way up the stack and see where the problem lies.

qt sqlite select statement

I decided to test an SQLite db for my Qt application.
I have created the SQLite db file with the proper statements (created the table, etc., and inserted some rows of data).
My problem is that when I execute a select statement I don't get any records.
This is the code I use:
qq.sprintf("SELECT * from descriptors WHERE descriptors.id=%d ", idx);
query.exec(qq);
if( query.isSelect() ){
    while (query.next()){
        int fff = query.value(0).toInt();
    }
}
The problem is that I never get inside the while loop. query.next() doesn't seem to work.
Any hints?
Thanks in advance,
Thodoris
P.S. I forgot to mention my configuration: Qt 4.7.3, Windows 7, Visual Studio 2008.
Other than the mistake hexa posted, query.isSelect() will always return true even if the query failed. You need to check the result of exec():
QSqlQuery query;
query.prepare( "SELECT * FROM descriptors WHERE id = ?" );
query.bindValue( 0, idx ); // assuming idx is an integer/long/QVariant value
if( !query.exec() )
{
    // Error Handling, check query.lastError(), probably return
}
// Note: if you don't return in case of an error, put this into the else{} part
while( query.next() )
{
    int fff = query.value( 0 ).toInt();
}
In my case, backward iteration over QSqlQueries worked. I think this could be a bug somewhere in the QSQLite Driver implementation.
QSqlQuery q = db.exec("SELECT * FROM Table");
if (q.last()) {
    do {
        // Do something with row...
    } while (q.previous());
}

NS_ERROR_FILE_IS_LOCKED with several requests using SQLite on the Mozilla platform

I'm trying to use the Storage mechanism in the Mozilla platform (in Thunderbird 3.0).
The following code is used after each test to erase the table present in the database:
function tearDown()
{
    let database = new Database();
    let req1 = "SELECT name FROM sqlite_master WHERE type='table'";
    let statement = database.connection.createStatement(req1);
    let tables = [];
    while(statement.executeStep()) {
        tables.push(statement.row.name);
    }
    statement.reset();
    for(table in tables) {
        let req2 = "DROP TABLE " + tables[table];
        database.connection.executeSimpleSQL(req2);
    }
}
But I get an error during executeSimpleSQL of req2 (NS_ERROR_FILE_IS_LOCKED). It seems that SQLite doesn't release the lock from the first statement. I've tried reset() and finalize(), but nothing works. How can I properly release the lock of the first statement?
Answering myself: I forgot to release a previous statement in earlier code in my application.
Final story: when you use statement.executeStep(), either make sure the last call on that statement returned false, or never forget to release it with statement.reset();
var statement = dbConn.createStatement("SELECT COUNT(name) AS nameOcurrences FROM Table1 WHERE name = '" + aName + "';");
var occurrences;
while(statement.executeStep()) {
    occurrences = statement.row.nameOcurrences;
}
