I am using SQLite in an embedded system application, with two SQLite databases. One database (DB1) has no issues; the other (DB2) works fine only for a while.
The issue with DB2 is that queries start failing with an SQLITE_CORRUPT error, while write operations continue to succeed indefinitely. So after a while, or once the file starts growing in size, I can no longer query the database at all.
DB1, on the other hand, has no such issues. The difference between the two is that the table in DB1 holds only float columns, while the table in DB2 contains 3-4 text columns along with int columns.
I cannot find any reason why text columns would cause the SQLite database to become corrupted.
SQLite version: 3.4.5
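One way to check whether the file itself is damaged, rather than one particular query misbehaving, is SQLite's built-in integrity check (a minimal sketch; run it against DB2 from the sqlite3 shell or through your code):

    -- Scans the whole database file and reports structural problems,
    -- or the single row "ok" if none are found.
    PRAGMA integrity_check;

If this reports errors, the file itself is damaged on disk rather than any particular query or column type being at fault.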
I have an SQLite db file of 30 GB. This is too large to import into Power BI Desktop (I don't have a premium licence), so I want to use DirectQuery.
This doesn't work with SQLite (see the link below).
https://learn.microsoft.com/nl-nl/power-bi/connect-data/power-bi-data-sources
I think it might be possible to use MariaDB (which is open source). I have never used this database, and it isn't clear to me whether it needs to be installed as a server or whether it uses a single database file like SQLite (file.db) that can easily be moved around.
Any advice, on Power BI or on the database?
I am testing MariaDB as a possible replacement for a MySQL data warehouse. This data warehouse is rebuilt nightly from a legacy database.
The basic process is to generate XML documents/files from the legacy database and then run DROP TABLE, CREATE TABLE from the DDL, and LOAD XML LOCAL INFILE 'xml file' (a sketch of the load appears below). A few of the XML files are 60-100 MB (about 300K rows). On MySQL these tables load in a couple of minutes; on MariaDB they take significantly longer (e.g. an 83 MB XML file requires 16 minutes on MariaDB versus less than 1 minute on MySQL), and the times seem to grow exponentially with file size.
I have read and followed the KB topic How to Quickly Insert Data Into MariaDB and tried the suggestions there with no real change. Since the MariaDB tables are dropped and recreated immediately before the LOAD XML LOCAL INFILE, several of those performance improvements should already be triggered.
I have not tried LOCK TABLES yet.
What can I try to improve performance? I don't want to return to MySQL, but this issue is a deal breaker.
Environment: RHEL 8, MariaDB 10.5.16.
Update: used DISABLE KEYS / LOAD ... / ENABLE KEYS with no apparent benefit, and increased max_allowed_packet with no effect.
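For reference, here is a minimal sketch of one nightly rebuild as described above, with the fast-insert settings from the KB topic applied; the table definition, file path, and row tag are hypothetical:

    -- Rebuild one warehouse table from its nightly XML export.
    DROP TABLE IF EXISTS warehouse_orders;
    CREATE TABLE warehouse_orders (
        id       INT PRIMARY KEY,
        customer VARCHAR(100),
        amount   DECIMAL(10,2)
    );

    -- Batch the whole load in one transaction and skip per-row checks,
    -- so InnoDB does not flush and validate row by row.
    SET autocommit = 0;
    SET unique_checks = 0;
    SET foreign_key_checks = 0;

    LOAD XML LOCAL INFILE '/data/exports/warehouse_orders.xml'
        INTO TABLE warehouse_orders
        ROWS IDENTIFIED BY '<row>';

    COMMIT;
    SET unique_checks = 1;
    SET foreign_key_checks = 1;

If the slowdown persists even in this form, timing the same data through a plain LOAD DATA equivalent could show whether the regression is in MariaDB's XML parsing or in the insert path itself.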
I'm going to end up with a rather large CubeSQLite database in the cloud, cloned on the local machine. My current databases already hold 185 tables, and growing. I store them in 6 SQLite database files and begin by attaching them together using the ATTACH DATABASE command. There are views that point to information in other databases; as a result, Navicat won't open the SQLite files individually, reporting them as corrupted even though they are not and work fine.
My actual question is this:
Considering the potential size of the files, is it better (faster or slower) to do it this way, or to put them all into one really large SQLite DB?
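For context, a minimal sketch of the attach-plus-cross-database-view arrangement described above (the file, table, and view names are hypothetical):

    -- Attach the separate database files under their own schema names.
    ATTACH DATABASE 'customers.db' AS cust;
    ATTACH DATABASE 'orders.db' AS ord;

    -- A view that joins across the attached files. It only resolves
    -- while both files are attached, which is why a tool opening a
    -- single file in isolation reports errors.
    CREATE VIEW order_summary AS
    SELECT c.name, o.order_date, o.total
    FROM cust.customers AS c
    JOIN ord.orders AS o ON o.customer_id = c.id;

Note that SQLite limits the number of simultaneously attached databases (10 in a default build), so a scheme built around many separate files has a hard ceiling that one large file does not.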
Excuse my English.
I am using SQLite and testing it: I inserted several million rows to test the speed, then deleted the rows after the inserts.
The database is now empty, but its size on disk is still 33 MB.
Why? Can you help me?
The VACUUM command rebuilds the entire database. There are several reasons an application might do this:

Unless SQLite is running in "auto_vacuum=FULL" mode, when a large amount of data is deleted from the database file it leaves behind empty space, or "free" database pages. This means the database file might be larger than strictly necessary. Running VACUUM to rebuild the database reclaims this space and reduces the size of the database file.

https://www.sqlite.org/lang_vacuum.html
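In practice, either of the following reclaims the space (a minimal sketch; note that switching auto_vacuum on an existing database only takes effect after a VACUUM):

    -- One-off: rebuild the file and release the free pages.
    VACUUM;

    -- Or keep the file compact automatically from now on.
    PRAGMA auto_vacuum = FULL;
    VACUUM;  -- required once for the new auto_vacuum mode to take effect

auto_vacuum=FULL trades file size for some write overhead, since free pages are moved to the end of the file and truncated at each transaction commit.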
I am working on a firm application in which I need to create a local database on my device.
I create my local database through a CREATE statement (it works well).
Then I take that file and perform the insert operations through the Firefox SQLite plugin; I need to insert approximately 2000 rows at a time, so I cannot do it in code. I just run the inserts manually through the SQLite plugin in Firefox.
After that, I use that file in place of my local database.
When I run a SELECT query through my code, it throws an exception:

    Exception: In create or prepare statement in DB
    net.rim.device.api.database.DatabaseException:
    SELECT distinct productline FROM T_Electrical ORDER BY productline: file is encrypted or is not a database
I found the solution to this problem: I was making a silly mistake by creating the file manually via right-click in my RES folder, which is not correct. The database must be created entirely from the SQLite plugin; then it works fine. In short: create the database (the file too) from SQLite and perform the insert operations from SQLite, and it will work fine.
This is a very rare problem, but I think it might be helpful for someone like me. :)
You should check to see if there is a version problem between the SQLite used by your Firefox installation and that on the BlackBerry. I think I had the same error when I tried to build a database file with SQLite version 2.
You also shouldn't need to create the database file on the device. To create large tables I use an Ubuntu machine and the sqlite3 command line: create the file, create the tables, insert the data, and build the indexes. Then I just copy the file onto the device into the proper directory.
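For example, a minimal desktop build session (the schema is hypothetical, loosely based on the table named in the question; start it with sqlite3 T_Electrical.db and then copy the resulting file to the device):

    -- Create the table, load the data, and index it on the desktop.
    CREATE TABLE T_Electrical (
        id          INTEGER PRIMARY KEY,
        productline TEXT NOT NULL
    );
    INSERT INTO T_Electrical (productline) VALUES ('Breakers');
    INSERT INTO T_Electrical (productline) VALUES ('Relays');
    CREATE INDEX idx_productline ON T_Electrical (productline);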
For me it was a simple thing: a password had been set on that db. I just used it, and the problem was solved.