Import fixed-width text file into sqlite

What is a good way to import fixed-width text files into sqlite tables, preferably without using peripheral software?
E.g. specifying the width of each field
Field 1: 10 characters starting with the second character
Field 2: 5 characters starting with the twelfth character
Field 3: 7 characters starting with the eighteenth character
The line
AABCDEFGHIJABCDEAABCDEFGA
would be imported as:
Field 1 Field 2 Field 3
ABCDEFGHIJ ABCDE ABCDEFG
Thanks

Here is one way to do it entirely inside SQLite: import each line into a single-column staging table, then split it with SUBSTR():
CREATE TABLE fixed_width_table (full_string CHAR);
.import fixed_width_file.txt fixed_width_table
CREATE TABLE tmp AS
SELECT
  SUBSTR(full_string, 2, 10) AS field1,
  SUBSTR(full_string, 12, 5) AS field2,
  SUBSTR(full_string, 18, 7) AS field3
FROM fixed_width_table;
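To check the split against the example line from the question, a quick sanity query over the staging result (tmp is the table created above; with the shell's default list output mode the columns are separated by |):
sqlite> SELECT field1, field2, field3 FROM tmp;
ABCDEFGHIJ|ABCDE|ABCDEFG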

The sqlite3 command-line shell's .import command only handles delimiter-separated files such as CSV; it has no fixed-width mode.
There are third-party tools that can import fixed-width files, but this answer shows how to do it inside SQLite with string functions.

To import a text file with fixed-length records:
1. Import the whole file into a table TestImport with a single column, extract.
2. Write the SQL statements you need to query or transform the data.
3. Do any additional work your ETL needs.
Step 1: import from your text file and save it to a db file.
sqlite> .import C:/yourFolder/text_flat.txt TestImport
sqlite> .save C:/yourFolder/text_flat_out.db
And now you can do all sorts of ETL with it (steps 2 and 3); a minimal sketch follows below.
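For steps 2 and 3, a sketch along the lines of the fixed-width answer above (the single import column is called extract as described in step 1; the SUBSTR offsets, the TRIM calls, and the parsed table name are only illustrative assumptions, adjust them to your record layout):
sqlite> CREATE TABLE parsed AS
   ...>   SELECT TRIM(SUBSTR(extract, 1, 10)) AS field_a,
   ...>          TRIM(SUBSTR(extract, 11, 5)) AS field_b,
   ...>          TRIM(SUBSTR(extract, 16, 7)) AS field_c
   ...>   FROM TestImport;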

Related

How to import csv file into sqlite except the first row of csv?

I am trying to import a CSV file into my SQLite table. I have created my SQLite table as:
CREATE TABLE car(id INTEGER PRIMARY KEY, name TEXT, model TEXT);
My CSV file is cars.csv:
Id,Name,Model
1,Car 1,BMW
2,Car 2,Mercedes
3,Car 3,BMW
Now, I am importing the CSV into SQLite using .import cars.csv car, but it imports all 4 rows of the CSV file, including the header row. I am not able to figure out how to import the CSV file without the first row of headers.
With the sqlite3 shell's .import command, if the first character of a quote-enclosed filename is a |, the rest of the filename is instead treated as a shell command that is executed to produce the data to be imported. So, what I do in this situation is:
sqlite> .import '| tail -n +2 cars.csv' car
The tail invocation will print all but the first line of the file.
If you're using SQLite 3.32.0 or newer (released May 2020), the shell can natively skip a given number of initial lines:
sqlite> .import -skip 1 cars.csv car
It also accepts a --csv option to force CSV mode for just that import, without having to do a .mode csv first.
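For example, combining the two options for this CSV:
sqlite> .import --csv --skip 1 cars.csv car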
If you can skip the CREATE TABLE step and import the file into a new table that does not exist yet, you can import the file, create the table, and skip the header row all in one step. If you must create the table first, for example because you do multiple imports of multiple files into the same table, then the only other option seems to be to import everything and delete the record created from the header row (you know the values in it anyway, so it is easy to find and delete); a sketch follows after the link, and see here for more examples:
SQLite3 Import CSV & exclude/skip header
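A minimal sketch of that import-then-delete workaround, using the cars.csv file and car table from above (note: if the id column is declared INTEGER PRIMARY KEY, the shell may reject the header row with a datatype mismatch instead of inserting it, in which case there is nothing left to delete):
sqlite> .mode csv
sqlite> .import cars.csv car
sqlite> DELETE FROM car WHERE name = 'Name' AND model = 'Model';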

Hi, I need to know how to insert a large amount of text into SQLite, because I am creating an application for a book with Ionic

"I am developing a mobile application with Ionic for a book that has a lot of content, grouped into
several parts, subparts, titles and chapters. I chose to use SQLite, and I'm looking for a way to
load all of its contents into my SQLite database. If anyone has an idea or suggestion to help me
do things well, I'll be delighted."
There are several ways to import a text file, e.g. as a BLOB, on a line-to-row basis, using sqlite's "archive" mode, and so on.
If you want the entire file as a single TEXT cell in a table, then (subject to the limitation described below) you can use .import by carefully selecting values for the separators.
Multibyte separators are not supported when importing, but control characters such as Control-A and Control-B can be used.
So you could proceed like so:
CREATE TABLE text("text" text);
.separator ^A ^B
.import text.txt text
where ^A should be replaced by a literal control-A, and similarly for ^B.
Limitation
The maximum number of bytes in a string or BLOB in SQLite is defined by the preprocessor macro SQLITE_MAX_LENGTH. The default value of this macro is 1 billion (1 thousand million or 1,000,000,000).
The following illustrates how to use the Python sqlite3 module to import a (UTF-8) text file:
import sqlite3

def create_table():
    # Single-column table that will hold the whole file as one TEXT value.
    conn.execute("""CREATE TABLE IF NOT EXISTS mytext( mytext TEXT );""")

def insert_file(filename):
    # Read the entire file and store it as a single row.
    sql = "INSERT INTO mytext(mytext) VALUES(?)"
    cursor.execute(sql, (open(filename, "r").read(),))
    conn.commit()

db_file_name = 'text-file.db'
conn = sqlite3.connect(db_file_name)
cursor = conn.cursor()
create_table()
insert_file("text.txt")
conn.close()
(Tested with Python 3.)
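Afterwards the import can be checked from the sqlite3 shell (a quick sanity query against the database and table names used above):
sqlite> .open text-file.db
sqlite> SELECT length(mytext) FROM mytext;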

Is it possible to import a CSV file to an existing table without the headers being included?

I'm trying to import a CSV file to a table that is empty but already exists in an SQLite database. For example:
sqlite> CREATE TABLE data (...);
sqlite> .mode csv
sqlite> .import mydata.csv data
I have created the table in advance because I'd like to specify a primary key, data types, and foreign key constraints. This process works as expected, but it unfortunately includes the header row from the CSV file in the table.
Here's what I've learned from the SQLite docs regarding CSV imports:
There are two cases to consider: (1) Table "tab1" does not previously exist and (2) table "tab1" does already exist.
In the first case, when the table does not previously exist, the table is automatically created and the content of the first row of the input CSV file is used to determine the name of all the columns in the table. In other words, if the table does not previously exist, the first row of the CSV file is interpreted to be column names and the actual data starts on the second row of the CSV file.
For the second case, when the table already exists, every row of the CSV file, including the first row, is assumed to be actual content. If the CSV file contains an initial row of column labels, that row will be read as data and inserted into the table. To avoid this, make sure that table does not previously exist.
So basically, I get extra data because I've created the table in advance. Is there a flag to change this behavior? If not, what's the best workaround?
The sqlite3 command-line shell has no such flag.
If you have a sufficiently advanced OS, you can use an external tool to split off the first line:
sqlite> .import "|tail -n +2 mydata.csv" data
You can also use the --skip 1 option with .import, as documented on the sqlite3 website and in an earlier answer above. So, you can use the following command:
.import --csv --skip 1 mydata.csv data

How to import a tsv file with SQLite3

I have a tsv (tab separated file) that I would like to import with sqlite3. Does someone know a clear way to do it?
I have installed sqlite3, but not created any database or tables yet.
I've tried the command
.import /path/filename.tsv my_new_table
but it gives me the error: no such table: my_new_table.
However, from what I'd read, it should create the table automatically if it doesn't exist. Does that mean I need to create and use a database first, or is there another trick to importing a .tsv file into SQLite?
There is actually a dedicated mode for importing tab separated files:
sqlite> .mode tabs
sqlite> .import data.tsv people
Also if you include a header row in your tsv file, you can let sqlite automatically create the table.
Just use an unused table name during import and change the tsv file to the following (the import commands are shown after the file):
name param1 param2
Bob 30 1000
Wendy 20 900
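With that header row in place, the first line becomes the column names and the remaining lines become rows, so the whole import is just (people here is the otherwise unused table name):
sqlite> .mode tabs
sqlite> .import data.tsv people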
You should create the table, set a separator and import the data (sqlite docs).
Example for TSV:
data.tsv (tab as a separator):
Bob 30 1000
Wendy 20 900
Create a table and set TAB as a separator:
sqlite> create table people (name text, param1 int, param2 int);
sqlite> .separator "\t"
Import data:
sqlite> .import data.tsv people
And the result is:
sqlite> select * from people;
Bob 30 1000
Wendy 20 900

How to specify row separators when importing into an sqlite db from a csv with non-default field and row separators?

I have a bunch of data that I exported from mssql using bcp with custom field and row separators. I would like to import the data into an sqlite database. Is there an easy way to do this with .import and .separator? Or do I need to use a newline as my row separator, alter the .import source, or make insert statements for each row?
Individual records should each be on their own line.
.separator sets the field separator. Do not quote it; just type your separating character after a single space.
To start the import, use .import FILE TABLE
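If the bcp export also uses a custom row separator rather than newlines, one option is to convert the row separator to newlines on the fly using the shell-pipe filename trick shown earlier in this thread. A hedged sketch (the file name bcp_export.dat and the separators ~ for fields and # for rows are made-up placeholders):
sqlite> .separator "~"
sqlite> .import '| tr "#" "\n" < bcp_export.dat' mytable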
I just tried the .separator and .import approach above for a text file containing records with "|" as the field separator. The file was saved as C:\temp\test.txt, and here are the commands that worked:
sqlite> .separator |
sqlite> .import C:\temp\test.txt some_table
These two commands loaded the data from the test.txt file into "some_table" in my SQLite database.
.import works great for a small number of rows, but it jammed on larger imports: it worked for 2,500 records but failed for 5,300 records.
