Octave: Read BLOB data from sqlite db file

I am trying to read BLOB data from a database file with Octave 6.2.0 and the mex-sqlite3-master package.
I am able to select and read any other data from my database file, but for the column containing BLOB data I get the following:
octave> x=sqlite3('file.db', 'SELECT column FROM list');
error: sqlite3: unsupported column type
octave> x=sqlite3('file.db', 'SELECT column FROM list WHERE column=CAST(column AS TEXT)');
gives no error; however, x comes back with dimensions 1x0.
The BLOB data contains hexadecimal numbers. I am fine with having them as strings (and will work my way on from there).
What can I do to extract the BLOB data in a processable format?
Thanks for any hint!

Thanks a lot for your answers! I tried the "select CAST(column AS TEXT) from bl;" hint. It created an array of the expected size, but the cells were empty.

I think you might want:
SELECT hex(column) FROM list
instead of doing a CAST to TEXT.
See Sqlite: How to cast(data as TEXT) for BLOB.
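For illustration, a minimal sketch of what hex() returns (the table and column names here are made up):
CREATE TABLE t (col BLOB);
INSERT INTO t VALUES (x'DEADBEEF');
SELECT hex(col) FROM t;   -- returns the text 'DEADBEEF'
Since hex() yields plain TEXT, the mex-sqlite3 wrapper should be able to return it as an ordinary string you can parse further in Octave.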

Related

Pasting SQL decimal columns into Excel

I have an issue with the data formats of Excel and SQL.
I have a column in SQL with datatype DECIMAL(18,0), and when I paste the query result into Excel, the last 3 digits get replaced by 0.
Example:
In SQL, the result set has a column called session id with decimal numbers like
119,597,417,242,309,670
329,621,151,415,350,454
134,460,940,261,658,890
but when I paste them into Excel, the trailing digits are replaced with zeros.
I tried changing the format in Excel to text before pasting; however, the whole format of the result set gets distorted (and only the first column pastes properly, without the 0's).
I can't keep casting all columns in SQL from decimal to int as there are way too many columns.
Can you please guide me as to what I can do?
Numeric fields in Excel are limited to 15 digits precision.
In SQL Assistant under Tools / Options / Data Format you can ask to have large Decimal (and BIGINT) fields displayed as text for just this sort of copy / paste. Or you can tell SQL Assistant to Save As or Export to Excel format.
For other tools you can explicitly FORMAT and CAST the data to VARCHAR in your SELECT so it is retrieved as text.
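A hedged sketch of that last approach (session_table is a placeholder name): casting the DECIMAL(18,0) column to VARCHAR means Excel receives text, so no digits are lost:
SELECT CAST(session_id AS VARCHAR(20)) AS session_id
FROM session_table;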
There are several things you can do. I'll list four.
Pick whichever suits you best.
1. First paste into a text editor (like Notepad), search/replace there, and paste that.
2. Set the data range where you're going to paste to "text", and then paste. After that you can search/replace and change to the correct format.
3. Change the regional settings of Windows to match the data that you have.
4. You can generate formulas from your SQL query instead of floating point numbers: generate text like =5/10 instead of 0.5 or 0,5. Excel will pick it up correctly regardless of your regional settings (see the sketch below).
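A hedged sketch of option 4 (placeholder names again; the || concatenation is ANSI SQL and may differ in your database):
SELECT '="' || CAST(session_id AS VARCHAR(20)) || '"' AS session_id
FROM session_table;
Pasting the resulting ="119597417242309670" cells makes Excel evaluate each one as a formula returning text, so every digit is preserved.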

Adding value to existing database table in RSQLite

I am new to RSQLite.
I have an input document in text format in which values are separated by '|'.
I created a table with the required variables (dummy code as follows)
library(RSQLite)
db <- dbConnect(SQLite(), dbname = "test.sqlite")
dbSendQuery(conn = db,
    "CREATE TABLE TABLE1(
     MARKS INTEGER,
     ROLLNUM INTEGER,
     NAME CHAR(25),
     DATED DATE)"
)
However, I am stuck on how to import values into the created table.
I cannot type out an INSERT INTO ... VALUES command for each row, as there are thousands of rows and more than 20 columns in the original data file, and it is impossible to manually type in each data point.
Can someone suggest an alternative efficient way to do so?
You are using a scripting language. The whole point of that is to avoid manually typing each data point. Sorry.
You have two routes:
1: You have correctly created a database connection and an empty table in your SQLite database. Nice!
To load data into the table, load your text file into R using e.g. df <- read.table('textfile.txt', sep='|') (modify arguments to fit your text file).
To have a 'dynamic' INSERT statement, you can use placeholders. RSQLite allows both named and positional placeholders. To insert a single row, you can do:
dbSendQuery(db, 'INSERT INTO table1 (MARKS, ROLLNUM, NAME) VALUES (?, ?, ?);', list(1, 16, 'Big fellow'))
You see? The first ? got value 1, the second ? got value 16, and the last ? got the string Big fellow. Also note that you do not enclose placeholders for text in quotation marks (' or ")!
Now, you have thousands of rows. Or just more than one. Either way, you can send in your data frame. dbSendQuery has two requirements: 1) each vector must have the same number of entries (not an issue when providing a data.frame), and 2) you may only submit the same number of vectors as you have placeholders.
I assume your data frame df contains the columns mark, roll, and name, corresponding to the table's columns. Then you may run:
dbSendQuery(db, 'INSERT INTO table1 (MARKS, ROLLNUM, NAME) VALUES (:mark, :roll, :name);', df)
This will execute an INSERT statement for each row in df!
TIP! Because an INSERT statement is executed for each row, inserting thousands of rows can take a long time: after each insert, data is written to file and indices are updated. Instead, enclose the inserts in a transaction:
dbBegin(db)
res <- dbSendQuery(db, 'INSERT ...;', df)
dbClearResult(res)
dbCommit(db)
and SQLite will stage the data in a journal file, only writing the result when you execute dbCommit(db). Try both methods and compare the speed!
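At the SQL level, the transaction that dbBegin()/dbCommit() wrap around the inserts amounts to roughly this (a sketch with invented values):
BEGIN TRANSACTION;
INSERT INTO TABLE1 (MARKS, ROLLNUM, NAME) VALUES (1, 16, 'Big fellow');
-- ... thousands more INSERTs ...
COMMIT;
The database file is updated once at COMMIT instead of after every single statement.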
2: Ah yes, the second way. This can be done entirely in SQLite.
With the SQLite command-line utility (sqlite3 from your command line, not R), you can import a text file as a table and simply do an INSERT INTO ... SELECT ...; command. Alternatively, read the text file into a temporary table in sqlite3 and run an INSERT INTO ... SELECT ...;, as sketched below.
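A rough sketch of that route inside the sqlite3 shell (the file and table names are assumptions, and the file's columns must line up with TABLE1; in recent sqlite3 versions, when temp_import does not exist, .import creates it using the first line of the file as column names):
.separator |
.import textfile.txt temp_import
INSERT INTO TABLE1 SELECT * FROM temp_import;
DROP TABLE temp_import;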
Useful site to remember: http://www.sqlite.com/lang.html
A little late to the party, but DBI provides dbAppendTable() which will write the contents of a dataframe to an SQL table. Column names in the dataframe must match the field names in the database. For your example, the following code would insert the contents of my random dataframe into your newly created table.
library(DBI)
db <- dbConnect(RSQLite::SQLite(), dbname = ":memory:")
dbExecute(db,
    "CREATE TABLE TABLE1(
     MARKS INTEGER,
     ROLLNUM INTEGER,
     NAME TEXT
     )"
)
df <- data.frame(MARKS = sample(1:100, 10),
                 ROLLNUM = sample(1:100, 10),
                 NAME = stringi::stri_rand_strings(10, 10))
dbAppendTable(db, "TABLE1", df)
I don't think there is a nice way to do a large number of inserts directly from R. SQLite does have a bulk insert functionality, but the RSQLite package does not appear to expose it.
From the command line you may try the following:
.separator |
.import your_file.csv your_table
where your_file.csv is the CSV (or pipe delimited) file containing your data and your_table is the destination table.
See the documentation under CSV Import for more information.

Teradata: cast using the length of a column

I need to use the CAST function with the length of a column in Teradata.
Say I have a table with the following data:
id | name
1  | dhawal
2  | bhaskar
I need to use a cast operation something like:
select cast(name as CHAR(<length of column>)) from table
How can I do that?
thanks
Dhawal
You have to find the length by looking at the table definition, either manually (SHOW TABLE) or by writing dynamic SQL that queries dbc.ColumnsV.
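A hedged sketch of that dictionary lookup (the database and table names are placeholders):
SELECT ColumnName, ColumnLength
FROM dbc.ColumnsV
WHERE DatabaseName = 'mydb'
  AND TableName = 'mytable'
  AND ColumnName = 'name';
ColumnLength gives the defined length for character columns, which you can splice into dynamically generated SQL that builds the CAST.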
Update:
You can find the maximum length of the actual data using
select max(length(cast(... as varchar(<large enough value>)))) from TABLE
But if this is for FastExport, I think casting as varchar(large-enough-value) and postprocessing to remove the 2-byte length info that FastExport includes is a better solution (since exporting a CHAR() will result in a fixed-length output file with lots of spaces in it).
You may know this already, but just in case: Teradata usually recommends switching to TPT instead of the legacy fexp.

SQLite: result of mathematical operation always is a text

I created the following table:
CREATE TABLE test (a INT,b INT);
After I inserted some data:
INSERT INTO test VALUES(1,2);
When I execute this SELECT:
SELECT cast(b as real) as x, a * b as y FROM test
the fields "x" and "y" come back with datatype TEXT. I'm using Delphi and SQLiteStudio 2.1.5, and both return the same datatype.
I need field x to be real and y to be int.
Could someone help me?
SQLite does not enforce column data types; a declared type is only a hint (affinity), and the result of an expression has no declared type at all. Based on your expression, SQLite will try to do the conversion for you, but for your math expression you need an explicit cast to give Delphi a hint about the desired column type.
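Based on that, a sketch of the query from the question with explicit casts on both expressions:
SELECT CAST(b AS REAL)        AS x,
       CAST(a * b AS INTEGER) AS y
FROM test;
With the casts in place, Delphi should infer REAL and INTEGER column types instead of defaulting to text.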
It seems strange to me that in SQLite column types are only "recommendations", but knowing this I changed my program: I created the fields manually with the correct types, and the program started to work.
The problem is that when you open a query and Delphi creates the columns automatically, columns produced by a cast or a mathematical operation are created as text fields.

How do I find the length (size) of a binary blob?

I have an SQLite table that contains a BLOB I need to do a size/length check on. How do I do that?
According to the documentation, length() only works on text and will stop counting after the first NUL. My tests confirmed this. I'm using SQLite 3.4.2.
I haven't had this problem, but you could try length(hex(blob))/2
Update (Aug-2012):
For SQLite 3.7.6 (released April 12, 2011) and later, length(blob_column) works as expected with both text and binary data.
For me length(blob) works just fine and gives the same results as the other approach.
As an additional answer: a common problem is that SQLite effectively ignores the declared column type of a table, so if you store a string in a blob column, that value becomes a string for that row. Since length() works differently on strings, it will then only return the number of characters before the first 0 octet. It is easy to store strings in blob columns by accident, because you normally have to cast explicitly to insert a blob:
insert into table values ('xxxx');               -- string insert
insert into table values (cast('xxxx' as blob)); -- blob insert
To get the correct length for values stored as strings, you can cast the argument of length() to blob:
select length(string-value-from-blob-column);    -- treats blob column as string
select length(cast(blob-column as blob));        -- correctly returns blob length
The reason why length(hex(blob-column))/2 works is that hex() doesn't stop at internal 0 octets, and the generated hex string doesn't contain any 0 octets, so length() returns the correct (full) length.
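A small demonstration of the difference (the literal is invented; x'61006263' is the four bytes a, NUL, b, c):
SELECT length(x'61006263');               -- 4 on SQLite 3.7.6+ (true blob length)
SELECT length(CAST(x'61006263' AS TEXT)); -- 1: counting stops at the embedded NUL
SELECT length(hex(x'61006263')) / 2;      -- 4: works regardless of version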
Example of a select query that does this, getting the length of the blob in column myblob, in table mytable, in row 3:
select length(myblob) from mytable where rowid=3;
The LENGTH() function in SQLite 3.7.13 on Debian 7 does not work on blobs, but LENGTH(HEX())/2 works fine:
# sqlite --version
3.7.13 2012-06-11 02:05:22 f5b5a13f7394dc143aa136f1d4faba6839eaa6dc
# sqlite xxx.db "SELECT docid, LENGTH(doccontent), LENGTH(HEX(doccontent))/2 AS b FROM cr_doc LIMIT 10;"
1|6|77824
2|5|176251
3|5|176251
4|6|39936
5|6|43520
6|494|101447
7|6|41472
8|6|61440
9|6|41984
10|6|41472
