Comma to point to show decimal in SQLite3

I've got an SQLite database that I populate directly from txt files. However, my text files use commas as the decimal separator. After inspecting the records that have already been inserted, this leads to confusion, as SQLite doesn't interpret these numbers correctly.
Is it possible to change records with a comma to a point in place (or should I rather populate the database over again)?

If you want to have repeatable and consistent processes, you should fix your import and execute it again.
If you want to change the characters in place, use the replace() function:
UPDATE MyTable
SET MyColumn = replace(MyColumn, ',', '.')
WHERE MyColumn LIKE '%,%';
If you want the result to be numbers, you also have to change the type with CAST:
UPDATE MyTable
SET MyColumn = CAST(replace(MyColumn, ',', '.') AS NUMERIC)
WHERE MyColumn LIKE '%,%';
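To verify the conversion, typeof() shows how SQLite actually stores a value. A minimal check (the literal is arbitrary, and MyTable/MyColumn are the placeholder names from above):
sqlite> SELECT typeof('3,14'), typeof(CAST(replace('3,14', ',', '.') AS NUMERIC));
text|real
sqlite> SELECT MyColumn, typeof(MyColumn) FROM MyTable LIMIT 5;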


Show negative real in SQLite table

I have a column C of type REAL in table F in SQLite. I want to join this wherever the negative value of C exists in another table (along with some other fields).
However, -C or 0-C etc. all return the rounded value of C, e.g. when C contains "123,456" then -C returns "-123".
Should I cast this via a string first, or is the syntax different?
Looks like the , in 123,456 is meant to be a decimal separator, but SQLite treats the whole thing as a string (i.e. '123,456' rather than 123.456). Keep in mind that SQLite's type system is a little different from SQL's, as values have types but columns don't:
[...] In SQLite, the datatype of a value is associated with the value itself, not with its container. [...]
So you can quietly put a string (that looks like a real number in some locales) into a real column and nothing bad happens until later.
You could fix the import process to interpret the decimal separator as desired before the data gets into SQLite, or you could use replace to fix them up as needed:
sqlite> select -'123,45';
-123
sqlite> select -replace('123,45', ',', '.');
-123.45
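Applied to the join in the question, you can negate the cleaned-up value on the fly. A sketch (the second table G and its column V are hypothetical, since the question doesn't name them):
sqlite> SELECT * FROM F JOIN G ON G.V = -CAST(replace(F.C, ',', '.') AS REAL);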

Adding value to existing database table in RSQLite

I am new to RSQLite.
I have an input document in text format in which values are separated by '|'.
I created a table with the required variables (dummy code as follows)
library(RSQLite)
db <- dbConnect(SQLite(), dbname="test.sqlite")
dbSendQuery(conn=db,
"CREATE TABLE TABLE1(
MARKS INTEGER,
ROLLNUM INTEGER,
NAME CHAR(25),
DATED DATE)"
)
However I am stuck on how to import values into the created table.
I cannot use the INSERT INTO ... VALUES command, as there are thousands of rows and 20+ columns in the original data file, and it is impossible to manually type in each data point.
Can someone suggest an alternative efficient way to do so?
You are using a scripting language. The whole point of that is literally to avoid manually typing each data point. Sorry.
You have two routes:
1: You have correctly loaded a database connection and created an empty table in your SQLite database. Nice!
To load data into the table, load your text file into R using e.g. df <- read.table('textfile.txt', sep='|') (modify arguments to fit your text file).
To have a 'dynamic' INSERT statement, you can use placeholders. RSQLite allows both named and positional placeholders. To insert a single row, you can do:
dbSendQuery(db, 'INSERT INTO table1 (MARKS, ROLLNUM, NAME) VALUES (?, ?, ?);', list(1, 16, 'Big fellow'))
You see? The first ? got value 1, the second ? got value 16, and the last ? got the string Big fellow. Also note that you do not enclose placeholders for text in quotation marks (' or ")!
Now, you have thousands of rows. Or just more than one. Either way, you can send in your data frame. dbSendQuery has some requirements: 1) each vector must have the same number of entries (not an issue when providing a data.frame), and 2) you may only submit the same number of vectors as you have placeholders.
I assume your data frame df contains the columns mark, roll, and name, corresponding to the placeholders. Then you may run:
dbSendQuery(db, 'INSERT INTO table1 (MARKS, ROLLNUM, NAME) VALUES (:mark, :roll, :name);', df)
This will execute an INSERT statement for each row in df!
TIP! Because an INSERT statement is executed for each row, inserting thousands of rows can take a long time, because after each insert the data is written to file and indices are updated. Instead, enclose it in a transaction:
dbBegin(db)                                # open a transaction
res <- dbSendQuery(db, 'INSERT ...;', df)  # the INSERT runs once per row of df
dbClearResult(res)                         # release the result set
dbCommit(db)                               # write everything to the file at once
and SQLite will save the data to a journal file, only writing the result to the database when you execute dbCommit(db). Try both methods and compare the speed!
2: Ah, yes. The second way. This can be done entirely in SQLite.
With the SQLite command-line utility (sqlite3 from your command line, not R), you can attach a text file as a table and simply do an INSERT INTO ... SELECT ...; command. Alternatively, read the text file in sqlite3 into a temporary table and run an INSERT INTO ... SELECT ...;.
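A sketch of the temporary-table route (the file and table names are assumptions; note that when the target table does not already exist, .import treats the file's first line as column names):
sqlite> .separator |
sqlite> .import textfile.txt temp_import
sqlite> INSERT INTO TABLE1 (MARKS, ROLLNUM, NAME, DATED) SELECT * FROM temp_import;
sqlite> DROP TABLE temp_import;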
Useful site to remember: http://www.sqlite.com/lang.html
A little late to the party, but DBI provides dbAppendTable() which will write the contents of a dataframe to an SQL table. Column names in the dataframe must match the field names in the database. For your example, the following code would insert the contents of my random dataframe into your newly created table.
library(DBI)
db <- dbConnect(RSQLite::SQLite(), dbname=":memory:")
dbExecute(db,
"CREATE TABLE TABLE1(
MARKS INTEGER,
ROLLNUM INTEGER,
NAME TEXT
)"
)
df <- data.frame(MARKS = sample(1:100, 10),
ROLLNUM = sample(1:100, 10),
NAME = stringi::stri_rand_strings(10, 10))
dbAppendTable(db, "TABLE1", df)
I don't think there is a nice way to do a large number of inserts directly from R. SQLite does have bulk import functionality, but the RSQLite package does not appear to expose it.
From the command line you may try the following:
.separator |
.import your_file.csv your_table
where your_file.csv is the CSV (or pipe delimited) file containing your data and your_table is the destination table.
See the documentation under CSV Import for more information.

How to query Unicode characters from SQL Server 2008

Using the NVARCHAR data type, I store text in my local language in a column. My problem is how to query that value from the database.
ዜናገብርኤልስ is the stored value.
I wrote SQL like this
select DivisionName
from t_Et_Divisions
where DivisionName = 'ዜናገብርኤልስ'
select unicode (DivisionName)
from t_Et_Divisions
where DivisionName = 'ዜናገብርኤልስ'
The above didn't work. Does anyone have any ideas on how to fix it?
Thanks!
You need to prefix your Unicode string literals with an N:
select DivisionName
from t_Et_Divisions
where DivisionName = N'ዜናገብርኤልስ'
This N prefix tells SQL Server to treat this string literal as a Unicode string and not convert it to a non-Unicode string (as it will if you omit the N prefix).
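A quick way to see the difference (a sketch; the exact fallback characters depend on the database's default collation, but characters outside the non-Unicode code page are typically replaced by question marks):
select 'ዜናገብርኤልስ'    -- without N: implicitly converted, typically returns '?????????'
select N'ዜናገብርኤልስ'   -- with N: the Unicode string survives intact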
Update:
I still fail to understand what is not working for you....
I tried setting up a table with an NVARCHAR column, and if I select, I get back that one, exact row match - as expected:
DECLARE @test TABLE (DivisionName NVARCHAR(100))
INSERT INTO @test (DivisionName)
VALUES (N'ዜናገብርኤልስ'), (N'ዜናገብርኤልስ,ኔትዎርክ,ከስተመር ስርቪስ'), (N'ኔትዎርክ,ከስተመር ስርቪስ')
SELECT *
FROM @test
WHERE DivisionName = N'ዜናገብርኤልስ'
This returns exactly one row - what else are you seeing, or what else are you expecting??
Update #2:
Ah - I see - the column contains multiple, comma-separated values - which is a horrible design mistake to begin with... (it violates first normal form of database design - don't do it!!)
And then you want to select all rows that contain that search term - but only display the search term itself, not the whole DivisionName column? Seems rather pointless..... try this:
select N'ዜናገብርኤልስ'
from t_Et_Divisions
where DivisionName LIKE N'%ዜናገብርኤልስ%'
The LIKE searches for rows that contain that value, and since you already know what you want to display, just put that value into the SELECT list ....

SQLite GROUP BY

I am using DB Browser for SQLite to extract some interesting data from my database, but I encountered one big problem with the GROUP BY statement.
Even the most basic SELECT I can imagine is not working properly.
(Filename nvarchar(2147483647))
SELECT FileName FROM TableName WHERE FileName LIKE '%Nieminen%' GROUP BY FileName gives 5 rows even though I know that there are 9 distinct FileNames containing the phrase 'Nieminen' (I've browsed it).
Can it be possible that GROUP BY in sqlite compares only N (e.g. 10) initial characters? From my observation it might be true...
Any clues?
Why don't you try out the following:
SELECT DISTINCT FileName FROM TableName WHERE FileName LIKE '%Nieminen%'
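If DISTINCT also returns 5 rows, then SQLite really does consider some of those 9 values equal. To see exactly which values group together and how many rows each group covers, you can list the group sizes (a sketch using the question's table and filter):
SELECT FileName, COUNT(*) FROM TableName WHERE FileName LIKE '%Nieminen%' GROUP BY FileName;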

SQLITE - switch data in columns

I've got a table with the following structure:
CREATE TABLE "mytable" ("column01" INTEGER NOT NULL , "column02" INTEGER NOT NULL )
And I want to switch the values between columns - I want column02 to become column01 and column01 to become column02.
i.e.:
column01 / column02
apple / 01
day / 05
light / 28
And I want it to become:
column01 / column02
01 / apple
05 / day
28 / light
Is there a way to achieve this, using only SQL query?
Thanks.
I just tested the query below and it works. In SQL, all expressions on the right-hand side of SET are evaluated against the row's original values, so the two assignments don't interfere with each other:
update mytable set column01 = column02, column02 = column01
I can't try it here, but this might work. It's a swap that needs no temporary variables.
The algorithm is:
x -= y;
y += x; // y gets the original value of x
x = (y - x); // x gets the original value of y
so that would make your UPDATE statement like this:
UPDATE mytable
SET column01 = column01 - column02,
column02 = column02 + column01,
column01 = column02 - column01
It will only work if the columns are evaluated in left-to-right order and in place, as opposed to from a snapshot of the original row. In fact SQLite, like standard SQL, evaluates SET expressions against the original row values, so the simple two-assignment swap above is the reliable approach.
SQLite is extremely limited in the ALTER TABLE commands allowed. As stated on the official project site:
SQLite supports a limited subset of ALTER TABLE. The ALTER TABLE command in SQLite allows the user to rename a table or to add a new column to an existing table. It is not possible to rename a column, remove a column, or add or remove constraints from a table.
Because of this, and because the two columns you want to swap are seemingly of different types (INTEGER and VARCHAR), I think you will need to export the table contents as SQL/CSV, drop the table, create a table with the structure you want, and then import the dumped file back into that table.
One possible way to do this:
sqlite> .output my_outfile.sql
This changes output from displaying on screen to being written to a file.
sqlite> .dump my_table
This method will dump the CREATE TABLE SQL and all the INSERT statements as a transaction. You'll need to modify my_outfile.sql with vi or another editor to manually remove the CREATE TABLE statements, and I think you'll also need to remove the BEGIN and END transaction commands, as I've had trouble importing data with them.
sqlite> .output stdout
This brings your "vision" back: command output will show on your screen again.
sqlite> CREATE TABLE ...
Re-create the table with the column order you want, being sure to change the types (INTEGER/VARCHAR) appropriately.
sqlite> .read my_outfile.sql
This will execute all the SQL commands in the file you dumped earlier, which should result in achieving the goal you were after as the INSERT statements dumped do not associate column names with specific values.
So, that should do it. A bit verbose, but it may be the only way with sqlite.
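For completeness, the same recreate-and-copy idea can also be done entirely in SQL, without editing a dump file. A sketch (the new table name is arbitrary, and the types are assumptions based on the sample data):
sqlite> CREATE TABLE mytable_new ("column01" INTEGER NOT NULL, "column02" TEXT NOT NULL);
sqlite> INSERT INTO mytable_new (column01, column02) SELECT column02, column01 FROM mytable;
sqlite> DROP TABLE mytable;
sqlite> ALTER TABLE mytable_new RENAME TO mytable;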
