I'm a little new to SQL, but I'm running into this problem.
I'm trying to create a SQL table using sqlite3 to hold a matrix of month data, then import a CSV file into that table.
My CSV file is a matrix with months as both the rows and the columns.
Run the SQLite shell program and create a new database:
sqlite3 monthdatabase.sqlite
In the shell program create a table for your month data:
CREATE TABLE monthdata (
row_id INTEGER,
col01 INTEGER,
col02 INTEGER,
col03 INTEGER,
col04 INTEGER,
col05 INTEGER,
col06 INTEGER,
col07 INTEGER,
col08 INTEGER,
col09 INTEGER,
col10 INTEGER,
col11 INTEGER,
col12 INTEGER,
PRIMARY KEY(row_id)
);
Check that the (empty) table is there:
SELECT * FROM monthdata;
The SQLite command line shell has a CSV import mode. It roughly works like this:
.import path/to/file.csv tablename
The import works seamlessly if the CSV file's structure is identical to the table's structure.
Depending on the delimiter in your CSV file, you may have to adjust the separator first:
.separator ";"
.import path/to/monthdata.csv monthdata
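If your CSV file has a header row, a newer sqlite3 shell (3.32 or later) can skip it at import time; a minimal sketch, assuming a semicolon-delimited file:
.mode csv
.separator ";"
.import --skip 1 path/to/monthdata.csv monthdata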
I'm getting a datatype mismatch error when I try to import data from a CSV file that contains only integers into a table whose two columns both have INTEGER data types. I can't seem to pinpoint what is causing the issue, even though I specified the separator to be a comma in sqlite3.
Below is a sample from my CSV
48,2015
125,2015
55,2015
Below is my table schema
CREATE TABLE Cars (
Car_ID INTEGER PRIMARY KEY,
Year INTEGER
);
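For what it's worth, a common cause of exactly this error (a hedged guess, since the full file isn't shown) is a header row: Car_ID is an INTEGER PRIMARY KEY, i.e. a rowid alias, and a text value such as a Car_ID,Year header line cannot be stored in it, which raises a datatype mismatch. A quick check, with cars.csv standing in for the real file name:
head -n 1 cars.csv
If a header line is there, skip it on import (sqlite3 3.32 or later):
.import --csv --skip 1 cars.csv Cars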
In an SQLite database, I created a table and imported CSV data into it.
The last column only contains integers:
ZIP;AMOUNT
78123;4272
95456;154
etc.
I used the following commands:
.mode csv
.separator ';'
CREATE TABLE MyTable("ZIP" TEXT, "AMOUNT" INTEGER);
.import input.csv MyTable
sqlite> select SUM(AMOUNT) from MyTable;
25270.0
Why is SQLite displaying SUM with a decimal?
Thank you.
===
Edit: here's the info:
sqlite> select typeof(AMOUNT) from MyTable LIMIT 10;
text
integer
integer
integer
integer
etc.
sqlite> select typeof(SUM(AMOUNT)) from MyTable;
real
===
Edit: Here's the top of input.csv as exported from LibreOffice Calc:
ZIP;AMOUNT
78123;4272
95456;154
etc.
Maybe I didn't use the right commands to import data into SQLite.
@PetSerAl has it. Because you're importing into an existing table, every line is imported, including the header. That text value makes sum() return a floating-point result.
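You can see the coercion in isolation; a tiny demo with made-up values (a non-numeric string sums as 0 but forces a real result):
sqlite> SELECT SUM(v), typeof(SUM(v)) FROM (SELECT 1 AS v UNION ALL SELECT 2 UNION ALL SELECT 'AMOUNT');
3.0|real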
I work around this case with:
.import '| tail -n +2 input.csv' mytable
which strips the first line.
If the first character of the filename is a |, the rest is treated as a shell command to run (via the C popen() function), and its output is used as the data to import.
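Alternatively, if the header row has already been imported, you can simply delete it after the fact; a sketch, assuming the stray row literally contains the column names:
DELETE FROM MyTable WHERE ZIP = 'ZIP';
SELECT SUM(AMOUNT), typeof(SUM(AMOUNT)) FROM MyTable;
With only integers left, sum() returns an integer again.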
I would like to edit Firefox cookies using Bash on OS X. I can use the following to convert cookies.sqlite to an ASCII text file:
cd ~/Library/Firefox; sqlite3 cookies.sqlite .dump > test
However, I have not yet found a way to convert an edited ASCII text file back to cookies.sqlite. I have tried both dump import and CSV import (sections 8 and 10 of https://www.sqlite.org/cli.html).
I suspect the main problem is the format of cookies.sqlite. The following is an example ASCII dump:
PRAGMA foreign_keys=OFF;
BEGIN TRANSACTION;
CREATE TABLE moz_cookies (id INTEGER PRIMARY KEY, baseDomain TEXT, originAttributes TEXT NOT NULL DEFAULT '', name TEXT, value TEXT, host TEXT, path TEXT, expiry INTEGER, lastAccessed INTEGER, creationTime INTEGER, isSecure INTEGER, isHttpOnly INTEGER, appId INTEGER DEFAULT 0, inBrowserElement INTEGER DEFAULT 0, CONSTRAINT moz_uniqueid UNIQUE (name, host, path, originAttributes));
INSERT INTO moz_cookies VALUES(33,'google.com','','CONSENT','WP.27b523','.google.com','/',2145916800,1561389135468630,1561365552747342,0,0,0,0);
INSERT INTO moz_cookies VALUES(115,'stackoverflow.com','','_gat','1','.stackoverflow.com','/',1561389104,1561389044656946,1561389044656946,0,0,0,0);
INSERT INTO moz_cookies VALUES(117,'stackoverflow.com','','usr','p=[2|6]','stackoverflow.com','/',1577200300,1561389100380300,1561389043655888,1,1,0,0);
INSERT INTO moz_cookies VALUES(120,'google.com','','1P_JAR','2019-06-24-15','.google.com','/',1563981135,1561389135573521,1561365552746756,0,0,0,0);
CREATE INDEX moz_basedomain ON moz_cookies (baseDomain, originAttributes);
COMMIT;
Any assistance would be appreciated.
Just redirect the file to sqlite3's standard input:
sqlite3 cookies.sqlite < test
You'll want to drop the existing tables first, though, to avoid all sorts of problems with duplicates.
Another alternative is to use .read FILENAME from within the sqlite3 shell.
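Putting it together, a minimal round trip might look like this (a sketch: the path comes from the question, Firefox should be closed while you touch its database, and renaming the old file aside is the bluntest way to avoid duplicate-insert conflicts, since the dump recreates the tables):
cd ~/Library/Firefox
sqlite3 cookies.sqlite .dump > test
# edit test ...
mv cookies.sqlite cookies.sqlite.bak
sqlite3 cookies.sqlite < test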
I want to import a CSV file into an SQLite database using
sqlite> .separator ,
sqlite> .mode csv data
sqlite> .import test.csv data
where data is the table name with three columns, just like the file.
The file has some string values that are encapsulated in double quotes.
Some of the string values have commas in them (an actual example from the file: "Bond\, James"), which should be treated as a single column, but SQLite produces an error:
Error: test.csv line 2: expected 3 columns of data but found 4
How can I make SQLite import these values correctly?
I know this is a bit old, but this was the first relevant google search result, so I wanted to share my solution.
Use a different separator and remove the quotes around the values (note that this sed assumes every field in the file is quoted):
sed -i -e 's/","/|/g' -e 's/"$//g' -e 's/^"//g' file.csv
sqlite> .separator "|"
sqlite> .import file.csv tablename
SQLite's .import will accept a CSV line like this:
fee, fi,"fo, fum"
provided that there is no space between the preceding comma and the string enclosed in quotes.
Since the following has a space between fi, and "fo
fee, fi, "fo, fum"
it will produce an error like:
expected 3 columns but found 4 - extras ignored
If anyone is wondering why this is the case, this was the response of Richard Hipp, author of SQLite, in two mails dated 21st May 2019 to the sqlite-users mailing list, in the thread 'CSV import does not handle fields with a comma surrounded by double'. (It should have been "double quotes", but I forgot the last word.) He wrote:
This is not valid CSV. There is an extra space character after the comma and before the double-quote.
And then
I'm going by RFC 4180. https://tools.ietf.org/html/rfc4180. On page 2 it says: "Spaces are considered part of a field and should not be ignored."
(In case anyone is wondering why I posted an Internet Archive copy of a third-party/unofficial archive: the IA copy is just out of an abundance of caution, and the unofficial archive is because, as far as I can tell, no official mailing list archive exists. The mailing list itself was discontinued some time ago.)
So the logic is that if the string is meant to include surrounding whitespace, the double quotes should enclose the leading space too.
Transcript session follows.
###################
## incorrect.csv ##
###################
fee, fi, "fo, fum"
#################
## correct.csv ##
#################
fee, fi,"fo, fum"
#############
## test.sh ##
#############
echo "Importing incorrect.csv into test.db"
sqlite3 test.db '.mode csv' 'DROP TABLE IF EXISTS incorrect;' 'CREATE TABLE IF NOT EXISTS incorrect(col1 TEXT PRIMARY KEY, col2 TEXT NOT NULL, col3 TEXT NOT NULL);' '.import incorrect.csv incorrect' '.exit'
echo
echo "Importing correct.csv into test.db"
sqlite3 test.db '.mode csv' 'DROP TABLE IF EXISTS correct;' 'CREATE TABLE IF NOT EXISTS correct(col1 TEXT PRIMARY KEY, col2 TEXT NOT NULL, col3 TEXT NOT NULL);' '.import correct.csv correct' '.exit'
echo
echo "Result of 'select * from incorrect'"
sqlite3 test.db 'select * from incorrect' '.exit'
echo
echo "Result of 'select * from correct'"
sqlite3 test.db 'select * from correct' '.exit'
$ sh test.sh
Importing incorrect.csv into test.db
incorrect.csv:1: expected 3 columns but found 4 - extras ignored
Importing correct.csv into test.db
Result of 'select * from incorrect'
fee| fi| "fo
Result of 'select * from correct'
fee| fi|fo, fum
I've experienced this issue myself and found it much, much easier to modify my script so that it dumps SQL queries instead of CSV-delimited values.
There are problems importing CSV data into sqlite3 not only with commas, but also with newline characters.
I would suggest one of the following (see the sketch after this list):
- Modify your script to produce SQL dumps instead of CSV.
- Convert the CSV dump to SQL queries and feed them to sqlite3.
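A minimal sketch of the second approach, assuming a three-column table named data, comma-separated input with no embedded commas or quotes, and a text third column (test.csv and test.db are placeholder names):
awk -F, '{ printf "INSERT INTO data VALUES(%s, %s, \047%s\047);\n", $1, $2, $3 }' test.csv > test.sql
sqlite3 test.db < test.sql
The \047 is an octal escape for a single quote, so the third column ends up quoted as an SQL string literal.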
I have a CSV file that has 2999 rows, but on importing it into a table in sqlite3 I get only 1363 rows. The following is the set of commands/queries I'm using. Unfortunately I cannot link to the raw data here for confidentiality reasons. Given that, can anybody point out what I may be missing, or whether there is any limit on import sizes? (Sorry, Google didn't help me.) Thanks in advance.
sqlite> CREATE TABLE test (var1 integer, var2 integer, var3 varchar(50));
sqlite> .separator ","
sqlite> .import data-v1.csv test
sqlite> select count(*) from test;
The output is 1363 when it should have been 2999.
I'm dumb ... there was a ^M instead of a newline character on a bunch of rows (not everywhere).
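If you hit the same thing, normalizing the line endings before import is an easy fix; a sketch, assuming the stray characters are bare carriage returns (old Mac-style CR line endings):
tr '\r' '\n' < data-v1.csv > data-fixed.csv
If the file instead has Windows-style CRLF endings, delete the carriage returns rather than converting them:
tr -d '\r' < data-v1.csv > data-fixed.csv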