This is my first SQLite + VC 2015 program; my project uses UTF-8.
I have an SQLite table in which I want to save Chinese text.
For example, I have table:
Cities {
Id TEXT PRIMARY KEY,
Desc TEXT }
Then I have a dialog with a text field where the user enters the city name, and a CString variable m_szName is linked to it.
I also have a piece of code that inserts the city into the table:
stringstream sql;
sql << "INSERT OR REPLACE INTO Cities "
<< " (Id,Desc) VALUES ('1001','" << m_szName.GetBuffer() << "')";
Now the problem is that m_szName.GetBuffer() returns TCHAR*, so the code above does not compile.
If I use "wstringstream sql", the code compiles, but the result is not accepted by sqlite3_exec(), since it only takes a char*.
I have tried to convert TCHAR* to char* here and there, but nothing works.
Please help, thanks.
sqlite3_exec() is one of the few functions that does not have a UTF-16 version, so you have to correctly convert the string contents into UTF-8:
CStringA str_utf8 = CW2A(m_szName.GetBuffer(), CP_UTF8);
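For completeness, a quick sketch of feeding that converted string into the original stringstream approach (it assumes db is your open sqlite3* handle and needs <sstream>/<iostream>; it still has the quoting problem described below):
std::stringstream sql;
sql << "INSERT OR REPLACE INTO Cities (Id, Desc) VALUES ('1001', '"
    << str_utf8.GetString() << "')";
char* errmsg = nullptr;
if (sqlite3_exec(db, sql.str().c_str(), nullptr, nullptr, &errmsg) != SQLITE_OK) {
    std::cerr << errmsg;   // report the SQLite error text
    sqlite3_free(errmsg);
}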
However, your code will blow up when the name contains a quote. It would be a much better idea to use parameters, where it is also possible to use UTF-16 strings directly:
const char *sql = "INSERT ... VALUES ('1001', ?)";
sqlite3_stmt *stmt;
int rc = sqlite3_prepare_v2(db, sql, -1, &stmt, NULL);   // db is your open sqlite3* handle
if (rc != SQLITE_OK) {
cerr << sqlite3_errmsg(db);
return;
}
sqlite3_bind_text16(stmt, 1, m_szName.GetBuffer(), -1, SQLITE_STATIC);
rc = sqlite3_step(stmt);
if (rc != SQLITE_DONE)
cerr << sqlite3_errmsg(db);
sqlite3_finalize(stmt);
I have the following problem:
I am developing a simple database application (PostgreSQL).
One of its features is that it stores a large block of text data (about 2000 lines), read from a file, in a column named "ascii" (the datatype of the column is BYTEA).
Here is an example of the code where I insert the data:
QFile *asciiFile = new QFile(ui->ascii_lineEdit->text());
asciiFile->open(QIODevice::ReadOnly);
QByteArray asciiArray = asciiFile->readAll();
qDebug() << asciiArray;
QSqlQuery *sourceQuery = new QSqlQuery();
sourceQuery->prepare("INSERT INTO source (image, image_material, archive, ascii, about) VALUES (:image, :image_material, :archive, :ascii, :about)");
sourceQuery->bindValue(":image", graphArray.toBase64());
sourceQuery->bindValue(":image_material", materialArray.toBase64());
sourceQuery->bindValue(":archive", archiveArray.toBase64());
sourceQuery->bindValue(":ascii", asciiArray.toBase64());
sourceQuery->bindValue(":about", ui->about_textEdit->toPlainText());
sourceQuery->exec();
Once more, the datatype of the column ("ascii") is BYTEA.
Now, when I try to read this data (nearly 2000 lines) back, it is not displayed correctly.
I tried this method:
QSqlQuery query = connector->getSourceAscii(item_id);
query.next();
QByteArray asciiArray = QByteArray::fromBase64(query.value("ascii").toByteArray());
QString *result = new QString(asciiArray);
qDebug() << *result;
It should be lines of digits, but I get something like this:
�w߇�k��ۇ������z���������燸��wӇ�k�{�����w��};��{��x{��k�{k�{k�xk�x߾���{k�{�����������wㇸ��
Examples of the lines I expect:
2490 0,21979421377182 4,82690520584583E-02
2491 0,226718083024025 4,33071963489056E-02
Do you have any suggestions about this? Thanks.
In Qt 5.4, using QSqlDatabase with sqlite3 on Ubuntu 14.04 64-bit:
First I open and call transaction() on the db.
Next I make 54 individual insert queries, each prepared, each deleted after execution.
Finally I call commit().
All calls complete without error and still the execution time is horrible (around 500 ms total for 54 trivial inserts).
My computer is reasonably modern and has striped SSD disks for performance. When accessing the sqlite file using Sqliteman it is blazingly fast.
So what is going on?
Here is the insert:
void BottleRigStorage::upsertTag(Tag &tag){
//ScopedTimer st("query time for tag");
if(open()){
QSqlQuery query(db);
query.prepare("INSERT OR REPLACE INTO tags ("
" id"
", batchID"
", retries"
", good"
", status"
", color"
", firstCheckTimestamp"
", createdTimestamp"
", modifiedTimestamp"
", fulfilledTimestamp"
") VALUES ("
" :id"
", :batchID"
", :retries"
", :good"
", :status"
", :color"
", :firstCheckTimestamp"
", :createdTimestamp"
", :modifiedTimestamp"
", :fulfilledTimestamp"
");");
query.bindValue(":id", tag.id);//8 chars
query.bindValue(":batchID", tag.batchID);//8 chars
query.bindValue(":retries", tag.retries);//int
query.bindValue(":good",tag.good?1:0);//bool
query.bindValue(":status", tag.status);//6 chars
query.bindValue(":color", tag.color);//7 chars
query.bindValue(":firstCheckTimestamp", tag.firstCheckTimestamp); //long
query.bindValue(":createdTimestamp", tag.createdTimestamp);//long
query.bindValue(":modifiedTimestamp", tag.modifiedTimestamp);//long
query.bindValue(":fulfilledTimestamp", tag.fulfilledTimestamp);//long
if (query.exec()) {
//qDebug() << "Successfully updated tag database after "<<st.getIntervalCompleteString();
}
else {
qWarning() << "ERROR: could not upsert tag with id " << tag.id<< ". Reason: "<< query.lastError();
}
query.finish();
}
else {
qWarning() << "ERROR: DB not open for upsert tag sqlite3";
}
}
UPDATE: And here is open() as requested:
bool BottleRigStorage::open(){
if(!db.isOpen()){
if(!db.open()){
qWarning() << "ERROR: could not open database. Reason: "<<db.lastError();
}
}
return db.isOpen();
}
Use prepare() only once. Your code prepares the query every time after the QSqlQuery is created. Instead, create and prepare the QSqlQuery outside of the function, and only bind the values and execute the query inside the function:
void BottleRigStorage::upsertTag(Tag &tag){
//ScopedTimer st("query time for tag");
if(open()){
query.bindValue(":id", tag.id);//8 chars
query.bindValue(":batchID", tag.batchID);//8 chars
query.bindValue(":retries", tag.retries);//int
query.bindValue(":good",tag.good?1:0);//bool
query.bindValue(":status", tag.status);//6 chars
query.bindValue(":color", tag.color);//7 chars
query.bindValue(":firstCheckTimestamp", tag.firstCheckTimestamp); //long
query.bindValue(":createdTimestamp", tag.createdTimestamp);//long
query.bindValue(":modifiedTimestamp", tag.modifiedTimestamp);//long
query.bindValue(":fulfilledTimestamp", tag.fulfilledTimestamp);//long
if (query.exec()) {
//qDebug() << "Successfully updated tag database after "<<st.getIntervalCompleteString();
}
else {
qWarning() << "ERROR: could not upsert tag with id " << tag.id<< ". Reason: "<< query.lastError();
}
query.finish();
}
else {
qWarning() << "ERROR: DB not open for upsert tag sqlite3";
}
}
The query object can then be a private member, created, for example, after the database has been initialized; a sketch follows below.
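For example (just a sketch: it assumes the class gains a private QSqlQuery query member and a hypothetical prepareQuery() method that is called once, after the database has been opened):
// In the class declaration: QSqlQuery query;   // private member, prepared once
void BottleRigStorage::prepareQuery(){   // hypothetical one-time setup, call after open()
    query = QSqlQuery(db);
    query.prepare("INSERT OR REPLACE INTO tags ("
        " id, batchID, retries, good, status, color,"
        " firstCheckTimestamp, createdTimestamp, modifiedTimestamp, fulfilledTimestamp"
        ") VALUES ("
        " :id, :batchID, :retries, :good, :status, :color,"
        " :firstCheckTimestamp, :createdTimestamp, :modifiedTimestamp, :fulfilledTimestamp"
        ");");
}
After that, upsertTag() only binds and executes, as shown above.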
You can also tune the SQLite database via pragmas. For example, the following settings will speed up query execution:
m_pDatabase->exec("PRAGMA synchronous = OFF");
m_pDatabase->exec("PRAGMA journal_mode = MEMORY");
You can read more about this here.
I was facing the same issue when I had around 99 programs, each with 99 steps, and I was reading that data from a CSV file on a pen drive and inserting it into the DB. It was taking more than 5 minutes, but it became much faster after I made a few changes in
main.cpp
db.open();
db.exec("PRAGMA synchronous = OFF");
db.exec("PRAGMA journal_mode = MEMORY");
and added a transaction and commit around the insert query in the class
model.cpp
qDebug()<<"can start a transaction PrgQuery:"<<QSqlDatabase::database().transaction();
query.prepare("insert query");
query.exec();
qDebug()<<"end transaction Step Query:"<<QSqlDatabase::database().commit();
This solved my problem and brought the time down to about 10 seconds. Pretty fast.
I'm learning SQLite in Qt but have run into a problem accessing record values returned by a QSqlQuery.
The details are below, but the gist is: I get a QSqlRecord back from a query and want to access all fields of the record, but QSqlRecord::count() reports only one column when there clearly are two (in this example, id and keyword).
Am I misunderstanding SQLite and what a query does, or is this a problem with how I am trying to access the records?
This is my schema:
This is my test data:
Full code:
void MainWindow::on_addKeywordBtn_clicked()
{
// find a matching keyword
QSqlQuery query(db);
query.prepare("SELECT keyword FROM keywords WHERE keyword = ?");
query.addBindValue(QString("blue"));
query.exec();
while (query.next()) {
QString k = query.value(0).toString();
qDebug() << "found" << k;
QSqlRecord rec = query.record();
qDebug() << "Number of columns: " << rec.count();
int idIndex = rec.indexOf("id");
int keywordIndex = rec.indexOf("keyword");
qDebug() << query.value(idIndex).toString() << query.value(keywordIndex).toString();
}
}
Console output:
found "blue"
Number of columns: 1
QSqlQuery::value: not positioned on a valid record
"" "blue"
Your mistake is in this line of the query:
query.prepare("SELECT keyword FROM keywords WHERE keyword = ?");
In your code you explicitly instruct the database to return only one column. Proper solutions would be:
query.prepare("SELECT * FROM keywords WHERE keyword = ?");
or
query.prepare("SELECT id, keyword FROM keywords WHERE keyword = ?");
I have a lot of data and I want to insert it into the DB in the least possible time. I ran some tests. I created a table in PostgreSQL using the script below:
CREATE TABLE test_table
(
id serial NOT NULL,
item integer NOT NULL,
count integer NOT NULL,
CONSTRAINT test_table_pkey PRIMARY KEY (id)
)
WITH (
OIDS=FALSE
);
ALTER TABLE test_table OWNER TO postgres;
I wrote test code that creates 1000 random values and inserts them into test_table in two different ways. First, using QSqlQuery::exec():
int insert() {
QSqlDatabase db = QSqlDatabase::addDatabase("QPSQL");
db.setHostName("127.0.0.1");
db.setDatabaseName("TestDB");
db.setUserName("postgres");
db.setPassword("1234");
if (!db.open()) {
qDebug() << "can not open DB";
return -1;
}
QString queryString = QString("INSERT INTO test_table (item, count)"
" VALUES (:item, :count)");
QSqlQuery query;
query.prepare(queryString);
QDateTime start = QDateTime::currentDateTime();
for (int i = 0; i < 1000; i++) {
query.bindValue(":item", qrand());
query.bindValue(":count", qrand());
if (!query.exec()) {
qDebug() << query.lastQuery();
qDebug() << query.lastError();
}
} //end of for i
QDateTime end = QDateTime::currentDateTime();
int diff = start.msecsTo(end);
return diff;
}
Second, using QSqlQuery::execBatch():
int batchInsert() {
QSqlDatabase db = QSqlDatabase::addDatabase("QPSQL");
db.setHostName("127.0.0.1");
db.setDatabaseName("TestDB");
db.setUserName("postgres");
db.setPassword("1234");
if (!db.open()) {
qDebug() << "can not open DB";
return -1;
}
QString queryString = QString("INSERT INTO test_table (item, count)"
" VALUES (:item, :count)");
QSqlQuery query;
query.prepare(queryString);
QVariantList itemList;
QVariantList CountList;
QDateTime start = QDateTime::currentDateTime();
for (int i = 0; i < 1000; i++) {
itemList.append(qrand());
CountList.append(qrand());
} //end of for i
query.addBindValue(itemList);
query.addBindValue(CountList);
if (!query.execBatch())
qDebug() << query.lastError();
QDateTime end = QDateTime::currentDateTime();
int diff = start.msecsTo(end);
return diff;
}
I found that there is no difference between them:
int main() {
    qDebug() << insert() << batchInsert();
    return 1;
}
Result:
14270 14663 (milliseconds)
How can I improve it?
The Qt documentation (http://doc.qt.io/qt-5/qsqlquery.html#execBatch) states:
If the database doesn't support batch executions, the driver will
simulate it using conventional exec() calls.
I'm not sure whether my DBMS supports batch executions or not.
How can I test that?
I'm not sure what the Qt driver does, but PostgreSQL supports running multiple statements in one transaction. Just do it manually instead of relying on the built-in feature of the driver.
Try changing your SQL so that it starts with:
BEGIN TRANSACTION;
Then, for every iteration of the loop, run an insert statement:
INSERT HERE;
Once the loop has finished for all 1000 records, issue this on the same connection:
COMMIT TRANSACTION;
Also, 1000 rows is not much to test with; you might want to try 100,000 or more to make sure the Qt batch really isn't helping.
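In Qt that loop would look roughly like this sketch (reusing db and the prepared query from the insert() function in the question):
db.transaction();                      // BEGIN TRANSACTION
for (int i = 0; i < 1000; i++) {
    query.bindValue(":item", qrand());
    query.bindValue(":count", qrand());
    if (!query.exec())
        qDebug() << query.lastError();
}
if (!db.commit())                      // COMMIT TRANSACTION, after all 1000 inserts
    qDebug() << db.lastError();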
By issuing 1000 insert statements, you have 1000 round trips to the database. This takes quite some time (network and scheduling latency). So try to reduce the number of insert statements!
Let's say you want to:
insert into test_table(item, count) values (1000, 10);
insert into test_table(item, count) values (1001, 20);
insert into test_table(item, count) values (1002, 30);
Transform it into a single query and the query will need less than half of the time:
insert into test_table(item, count) values (1000, 10), (1001, 20), (1002, 30);
In PostgreSQL, there is another way to write it:
insert into test_table(item, count) values (
unnest(array[1000, 1001, 1002]),
unnest(array[10, 20, 30]));
My reason for presenting the second way is that you can pass the entire content of a big array in a single parameter (tested in C# with the database driver "Npgsql"):
insert into test_table(item, count) values (unnest(:items), unnest(:counts));
items is a query parameter with the value int[]{1000, 1001, 1002}
counts is a query parameter with the value int[]{10, 20, 30}
Today, I cut the running time of 10,000 inserts in C# from 80 s to 550 ms with this technique. It's easy. Furthermore, there is no hassle with transactions, as a single statement is never split into multiple transactions.
I hope this works with the Qt PostgreSQL driver, too. On the server side you need PostgreSQL >= 8.4, as older versions do not provide unnest (but there may be workarounds).
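I don't know whether the Qt PostgreSQL driver accepts array parameters, but the multi-row VALUES form above can be built from Qt with ordinary placeholders (a sketch, reusing the random test data from the question):
QStringList rows;
QVariantList values;
for (int i = 0; i < 1000; i++) {       // one "(?, ?)" group per row
    rows << "(?, ?)";
    values << qrand() << qrand();
}
QSqlQuery query;
query.prepare(QString("INSERT INTO test_table (item, count) VALUES ") + rows.join(", "));
for (const QVariant &v : values)       // bind all 2000 values positionally
    query.addBindValue(v);
if (!query.exec())
    qDebug() << query.lastError();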
You can use QSqlDriver::hasFeature() with the argument QSqlDriver::BatchOperations.
In the Qt 4.8 sources I found that only the OCI (Oracle) driver supports BatchOperations. I don't know why the QPSQL driver doesn't use the COPY statement for PostgreSQL.
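For example, a quick check against the default connection (the one opened in insert()/batchInsert() above):
QSqlDatabase db = QSqlDatabase::database();
qDebug() << "Batch operations supported:"            // needs <QSqlDriver>
         << db.driver()->hasFeature(QSqlDriver::BatchOperations);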
I'm working on the development of a C++ API which uses custom-designed plugins
to interface with different database engines using their APIs and specific SQL
syntax.
Currently, I'm attempting to find a way of inserting BLOBs, but since NUL is the terminating character of C/C++ strings, the BLOB gets truncated when I construct the INSERT INTO query string. So far, I've worked with:
//...
char* sql;
void* blob;
int len;
//...
blob = some_blob_already_in_memory;
len = length_of_blob_already_known;
sql = sqlite3_malloc(2*len+1);
sql = sqlite3_mprintf("INSERT INTO table VALUES (%Q)", (char*)blob);
//...
I expect that, if it is at all possible to do it in the SQLite3 interactive console, it should be possible to construct the query string with properly escaped NULL characters. Maybe there's a way to do this with standard SQL which is also supported by SQLite SQL syntax?
Surely someone must have faced the same situation before. I've googled and found some answers, but they were for other programming languages (Python).
Thank you in advance for your feedback.
Thank you all again for your feedback. This time I'm reporting how I solved the problem with the help of the indications provided here. Hopefully this will help others in the future.
As suggested by the first three posters, I did use prepared statements, additionally because I was also interested in getting the columns' data types, and a simple sqlite3_get_table() wouldn't do.
After preparing the SQL statement in the form of the following constant string:
INSERT INTO tabl VALUES(?,?,?,?);
what remains is binding the corresponding values. This is done by issuing as many sqlite3_bind_blob() calls as there are columns. (I also resorted to sqlite3_bind_text() for other "simple" data types, because the API I'm working on can translate integers/doubles/etc. into a string.) So:
#include <stdio.h>
#include <string.h>
#include <sqlite3.h>
/* ... */
void* blobvalue[4] = { NULL, NULL, NULL, NULL };
int blobsize[4] = { 0, 0, 0, 0 };
const char* tail = NULL;
const char* sql = "INSERT INTO tabl VALUES(?,?,?,?)";
sqlite3_stmt* stmt = NULL;
sqlite3* db = NULL;
/* ... */
sqlite3_open("sqlite.db", &db);
sqlite3_prepare_v2(db,
sql, strlen(sql) + 1,
&stmt, &tail);
for(unsigned int i = 0; i < 4; i++) {
sqlite3_bind_blob(stmt,
i + 1, blobvalue[i], blobsize[i],
SQLITE_TRANSIENT);
}
if(sqlite3_step(stmt) != SQLITE_DONE) {
printf("Error message: %s\n", sqlite3_errmsg(db));
}
sqlite3_finalize(stmt);
sqlite3_close(db);
Note also that some functions (sqlite3_open_v2(), sqlite3_prepare_v2()) only appear in later SQLite versions (3.5.x and later, I believe).
The SQLite table tabl in file sqlite.db can be created with (for example)
CREATE TABLE tabl(a TEXT PRIMARY KEY, b TEXT, c TEXT, d TEXT);
You'll want to use this function with a prepared statement.
int sqlite3_bind_blob(sqlite3_stmt*, int, const void*, int n, void(*)(void*));
In C/C++, the standard way of dealing with NULLs in strings is to either store the beginning of the string and a length, or store a pointer to the beginning of a string and one to the end of the string.
You want to precompile the statement with sqlite3_prepare_v2(), and then bind the blob in using sqlite3_bind_blob(). Note that the statement you bind into will be INSERT INTO table VALUES (?).
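A minimal sketch (assuming db is an open sqlite3* handle, blob and len are the buffer and length from your snippet, and "table" stands for your actual table name):
sqlite3_stmt *stmt = NULL;
if (sqlite3_prepare_v2(db, "INSERT INTO table VALUES (?)", -1, &stmt, NULL) == SQLITE_OK) {
    /* SQLITE_TRANSIENT makes SQLite copy the buffer, embedded NUL bytes included */
    sqlite3_bind_blob(stmt, 1, blob, len, SQLITE_TRANSIENT);
    if (sqlite3_step(stmt) != SQLITE_DONE)
        printf("Error: %s\n", sqlite3_errmsg(db));
    sqlite3_finalize(stmt);
}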