With DBeaver, MariaDB Optimizer Trace returns a result irrelevant to the concerned query - mariadb

With a non-root user on 10.5.18-MariaDB, we are trying to get the optimizer trace for the SQL query select * from eav_value ev; however, the result seems irrelevant, saying something about select database() AS DATABASE():
{
  "steps": [
    {
      "join_preparation": {
        "select_id": 1,
        "steps": [
          {
            "expanded_query": "select database() AS `DATABASE()`"
          }
        ]
      }
    },
    {
      "join_optimization": {
        "select_id": 1,
        "steps": []
      }
    },
    {
      "join_execution": {
        "select_id": 1,
        "steps": []
      }
    }
  ]
}
We also tried to get the optimizer traces for other queries but got the same result as above, regardless of the query.
If we switch to a CLI connection, the behaviour is as expected. We prefer GUI tools such as DBeaver, though.
We are learning how to use the optimizer trace, so we wonder if we missed anything.
The commands used in the test:
select @@version;
show create table INFORMATION_SCHEMA.OPTIMIZER_TRACE;
set session OPTIMIZER_TRACE = "enabled=on";
select * from eav_value ev;
select * from information_schema.OPTIMIZER_TRACE;
set session OPTIMIZER_TRACE = "enabled=off";

DBeaver uses MariaDB Connector/J, and it looks like this driver disconnects after completing the query. Therefore, the statement that fetches the optimizer trace does not run in the same session as the traced query, and the optimizer trace is session-scoped.
We are still open to suggestions about how to make DBeaver hold on to the same session instead of disconnecting after each query.
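For reference, the session dependency can be reproduced with a plain Connector/J program: running the whole sequence over a single Connection behaves like the CLI. A minimal sketch, assuming the stock MariaDB Connector/J driver; the JDBC URL and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class OptimizerTraceDemo {
    public static void main(String[] args) throws Exception {
        // One connection = one session; the trace is only visible here.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mariadb://localhost:3306/mydb", "user", "password");
             Statement st = conn.createStatement()) {
            st.execute("set session optimizer_trace = 'enabled=on'");
            st.execute("select * from eav_value ev"); // the traced query
            try (ResultSet rs = st.executeQuery(
                    "select * from information_schema.OPTIMIZER_TRACE")) {
                while (rs.next()) {
                    System.out.println(rs.getString("TRACE")); // JSON trace
                }
            }
            st.execute("set session optimizer_trace = 'enabled=off'");
        }
    }
}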

Related

SQLite via JDBC: SQLITE_BUSY when inserting after selecting

I'm getting the error code SQLITE_BUSY when trying to write to a table after selecting from it. The select statement and result is properly closed prior to my insert.
If I remove the select part, the insert works fine, and this is what I'm not getting. According to the documentation, SQLITE_BUSY should mean that a different process or connection (which is definitely not the case here) is blocking the database.
There's no SQLite manager running. Also, jdbcConn is the only connection to the database I have. No parallel running threads either.
Here's my code:
try {
    if (!jdbcConn.isClosed()) {
        ArrayList<String> variablesToAdd = new ArrayList<String>();
        String sql = "SELECT * FROM VARIABLES WHERE Name = ?";
        // Collect the names that are not present in the table yet
        try (PreparedStatement stmt = jdbcConn.prepareStatement(sql)) {
            for (InVariable variable : this.variables.values()) {
                stmt.setString(1, variable.getName());
                try (ResultSet rs = stmt.executeQuery()) {
                    if (!rs.next()) {
                        variablesToAdd.add(variable.getName());
                    }
                }
            }
        }
        if (variablesToAdd.size() > 0) {
            String sqlInsert = "INSERT INTO VARIABLES(Name, Var_Value) VALUES(?, '')";
            try (PreparedStatement stmtInsert = jdbcConn.prepareStatement(sqlInsert)) {
                for (String name : variablesToAdd) {
                    stmtInsert.setString(1, name);
                    int affectedRows = stmtInsert.executeUpdate();
                    if (affectedRows == 0) {
                        LogManager.getLogger().error("Error while trying to add missing database variable '" + name + "'.");
                    }
                }
            }
            jdbcConn.commit();
        }
    }
}
catch (Exception e) {
    LogManager.getLogger().error("Error creating potentially missing database variables.", e);
}
This crashes on int affectedRows = stmtInsert.executeUpdate();. Now if I remove the first block (and manually add a value to the variablesToAdd list) the value inserts fine into the database.
Am I missing something? Am I not closing the ResultSet and PreparedStatement properly? Maybe I'm blind to my mistake from looking at it for too long.
Edit: Also executing the select in a separate thread does the trick. But that can't be the solution. Am I trying to insert into the database too fast after closing previous statements?
Edit2: I came across a busy_timeout, which promised to make updates/queries wait for a specified amount of time before returning with SQLITE_BUSY. I tried setting the busy timeout like so:
if (jdbcConn.prepareStatement("PRAGMA busy_timeout = 30000").execute()) {
    jdbcConn.commit();
}
The executeUpdate() function still immediately returns with SQLITE_BUSY.
I'm dumb.
I was so thrown off by the fact that removing the select statement worked (still not sure why that worked, probably bad timing) that I missed a different thread using the same file.
Made both threads use the same java.sql.Connection and everything works fine now.
Thank you for pushing me in the right direction @GordThompson. Wasn't aware of the jdbc:sqlite::memory: option, which led to me finding the issue.
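For reference, a minimal sketch of that fix, with made-up class and field names: one java.sql.Connection is created once and handed to both threads, instead of each thread opening its own connection to the same file:

import java.sql.Connection;
import java.sql.DriverManager;

public class Db {
    // Single shared connection; both threads use this instead of
    // opening their own connection to the same SQLite file.
    private static Connection shared;

    public static synchronized Connection get() throws Exception {
        if (shared == null || shared.isClosed()) {
            shared = DriverManager.getConnection("jdbc:sqlite:variables.db");
            shared.setAutoCommit(false); // matches the explicit commit() above
        }
        return shared;
    }
}

Access still has to be serialized (e.g. by synchronizing on the connection), since the two threads now share one handle.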

How to add a DynamoDB global secondary Index via Python/Boto3

Is it possible to add a Global Secondary Index to an existing DynamoDB table AFTER it has been created? I am using Python 3.x with Boto3 and have not been able to find any examples of them being added to the table after it was created.
In general, yes it is possible to add a Global Secondary Index (GSI) after the table is created.
However, it can take a long time for the change to come into effect, because building the GSI requires a table scan.
In the case of boto3, have a look at the documentation for update_table.
For example, you could try something like this:
response = client.update_table(
    TableName='YourTableName',
    # ...snip...
    GlobalSecondaryIndexUpdates=[
        {
            'Create': {
                'IndexName': 'YourGSIName',
                'KeySchema': [
                    {
                        'AttributeName': 'YourGSIFieldName',
                        'KeyType': 'HASH'
                    }
                ],
                'Projection': {
                    'ProjectionType': 'ALL'
                },
                'ProvisionedThroughput': {
                    'ReadCapacityUnits': 1,
                    'WriteCapacityUnits': 1
                }
            }
        }
    ],
    # ...snip...
)
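Since the backfill can take a while, it may help to poll describe_table until the new index reports ACTIVE before querying it. A minimal sketch, reusing the placeholder names from above:

import time
import boto3

client = boto3.client('dynamodb')

def wait_for_gsi_active(table_name, index_name, delay=10):
    # Poll describe_table until the new GSI finishes backfilling.
    while True:
        desc = client.describe_table(TableName=table_name)['Table']
        indexes = desc.get('GlobalSecondaryIndexes', [])
        status = next((i['IndexStatus'] for i in indexes
                       if i['IndexName'] == index_name), None)
        if status == 'ACTIVE':
            return
        time.sleep(delay)

wait_for_gsi_active('YourTableName', 'YourGSIName')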

How can I see the SQL generated by SQLite.NET PCL in Xamarin Studio?

I researched this and all I can find is a suggestion to turn on .Trace = true like this:
db1 = DependencyService.Get<ISQLite>().GetConnection();
db1.Trace = true;
I also tried this:
db2.Trace = true;
var categories = db2.Query<Category>("SELECT * FROM Category ORDER BY Name").ToList();
Debug.WriteLine("xxxx");
Well, I did this and then restarted the application. When I view the Application Output I just see information on threads started and the xxxx, but I don't see any SQL trace information.
Can anyone give me advice on this? Thanks.
You need to set Trace and Tracer (action) properties on your SQLiteConnection to print queries to output:
db.Tracer = new Action<string>(q => Debug.WriteLine(q));
db.Trace = true;
Look in the Application Output window for lines that begin with Executing.
Example Output after setting Trace to true:
Executing: create table if not exists "Valuation"(
"Id" integer primary key autoincrement not null ,
"StockId" integer ,
"Time" datetime ,
"Price" float )
Executing Query: pragma table_info("Valuation")
Executing: create index if not exists "Valuation_StockId" on "Valuation"("StockId")
Executing: insert into "Stock"("Symbol") values (?)
Executing Query: select * from "Stock" where ("Symbol" like (? || '%'))
0: A
Ref: https://github.com/praeclarum/sqlite-net/blob/38a5ae07c886d6f62cecd8fdeb8910d9b5a77546/src/SQLite.cs
The SQLite PCL uses Debug.WriteLine which means that the logs are only included in Debug builds of the PCL.
Remove your NuGet reference to the sqlite.net PCL (leave the native reference), add SQLite.cs as a class to your project instead, and execute a debug build with the Trace flag set; you'll then see the tracing.
I didn't have to do anything special other than include the SQLite.cs file in my Xamarin iOS project for this to work:
using (var conn = new SQLite.SQLiteConnection("mydb.sqlite") { Trace = true }) {
    var rows = conn.Table<PodcastMetadata>().Where(row => row.DurationMinutes < 10).Select(row => new { row.Title });
    foreach (var row in rows) {
        Debug.WriteLine(row);
    }
}
Output:
Executing Query: select * from "PodcastMetadata" where ("DurationMinutes" < ?)
0: 10

SQLite storage API Insert statement freezes entire firefox in bootstrapped(Restartless) AddOn

Data to be inserted has just two TEXT columns whose individual length don't even exceed 256.
I initially used executeSimpleSQL since I didn't need to get any results.
It worked for simultaneous inserts of up to 20K smoothly, i.e. no lag or freezing observed in the background.
However, with 0.1 million I could see horrible freezing during insertion.
So, I tried these two,
Insert in chunks of 500 records - this didn't work well, since even for 20K records it showed visible freezing. I didn't even try with 0.1 million.
So, I decided to go async and used executeAsync along with binding parameters etc. This also shows visible freezing for just 20K records. This was the whole array being inserted and not in chunks.
var dirs = Cc["@mozilla.org/file/directory_service;1"].
           getService(Ci.nsIProperties);
var dbFile = dirs.get("ProfD", Ci.nsIFile);
var dbService = Cc["@mozilla.org/storage/service;1"].
                getService(Ci.mozIStorageService);
dbFile.append('mydatabase.sqlite');
var connectDB = dbService.openDatabase(dbFile);
let insertStatement = connectDB.createStatement(
    'INSERT INTO my_table (my_col_a,my_col_b) VALUES (:myColumnA,:myColumnB)');
var arraybind = insertStatement.newBindingParamsArray();
for (let i = 0; i < my_data_array.length; i++) {
    let params = arraybind.newBindingParams();
    // Individual elements of the array are csv
    my_data_arrayTC = my_data_array[i].split(',');
    params.bindByName("myColumnA", my_data_arrayTC[0]);
    params.bindByName("myColumnB", my_data_arrayTC[1]);
    arraybind.addParams(params);
}
insertStatement.bindParameters(arraybind);
insertStatement.executeAsync({
    handleResult: function(aResult) {
        console.log('Results are out');
    },
    handleError: function(aError) {
        console.log("Error: " + aError.message);
    },
    handleCompletion: function(aReason) {
        if (aReason != Components.interfaces.mozIStorageStatementCallback.REASON_FINISHED)
            console.log("Query canceled or aborted!");
        console.log('We are done inserting');
    }
});
connectDB.asyncClose(function() {
    console.log('[INFO][Write Database] Async - plus domain data');
});
Also, I seem to get the async callbacks after a long time; usually executeSimpleSQL is way faster than this. If I use the SQLite Manager Tool extension to open the DB immediately, this is what I get (as expected):
SQLiteManager: Error in opening file mydatabase.sqlite - either the file is encrypted or corrupt
Exception Name: NS_ERROR_STORAGE_BUSY
Exception Message: Component returned failure code: 0x80630001 (NS_ERROR_STORAGE_BUSY) [mozIStorageService.openUnsharedDatabase]
My primary objective was to dump data as big as 0.1 million+ records and then later on perform reads when needed.

Firefox addon development: SQLite Database Connection

So I am developing this add-on using MDN's Add-on Builder and need to connect to the SQLite database. The connection gets created fine, and insertion is fine as long as I am inserting values without binding parameters (that is, through executeSimpleSQL()). As soon as I use the createStatement() method to INSERT values, it does not work. Here's what I have done so far.
let file = FileUtils.getFile("Desk", ["my_db_file_name.sqlite"]);
let mDBConn = Services.storage.openDatabase(file);
mDBConn.executeSimpleSQL("CREATE TEMP TABLE IF NOT EXISTS element (rating VARCHAR(50))");
let stmt = mDBConn.createStatement("INSERT INTO element (rating) VALUES(:value)");
stmt.params.value = 13;
//mDBConn.executeSimpleSQL("INSERT INTO element (rating) VALUES(13)");
var statement = mDBConn.createStatement("SELECT * FROM element WHERE rating = :rat");
statement.params.rat = 13;
try {
    while (statement.step()) {
        let value = statement.row.rating;
        console.log(value);
    }
}
finally {
    statement.reset();
}
Note that the SELECT statement with the bound parameters works fine, it's just the INSERT statement that's problematic.
Any ideas?
You forgot to call execute().
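That is, after binding the parameter the INSERT still has to be run, and the statement can then be released:

stmt.params.value = 13;
stmt.execute();   // actually runs the INSERT
stmt.finalize();  // release the statement when done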
