I'm trying to load a table in a Sybase IQ database from a text file, and I'm having trouble with a datetime field: I always get the error "data type conversion is not possible". I tried a lot of ways to solve it:
creating a varchar field and converting it to a date
creating a temp table and inserting values into my table from the temp table using dateformat, cast, and convert
specifying the format directly in the load statement:
load table table_name(
datetime_column datetime('dd-mm-yyyy hh-mm-ss')
) from ...
Nothing helps. Any ideas? Thanks.
So I found the solution:
load table table_name (
temp_date ' | '  -- temp_date is a varchar column; dt, the real datetime column, is not loaded directly
)
from file_name
Then convert and clean up:
set dateformat dmy;
update table_name set dt = temp_date;
ALTER TABLE table_name DROP temp_date;
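The workaround boils down to "load the value as text, then convert it with an explicit format". As a neutral sketch of that conversion step outside Sybase, here it is in Python, with a hypothetical sample value matching the dd-mm-yyyy hh-mm-ss mask from the question:

```python
from datetime import datetime

# Parse the raw text with an explicit day-first format, mirroring the
# temp_date -> dt update; no server-side format settings are involved.
raw = "19-11-2014 10-10-41"  # hypothetical value from the load file
dt = datetime.strptime(raw, "%d-%m-%Y %H-%M-%S")
print(dt.isoformat())  # -> 2014-11-19T10:10:41
```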
I constantly retrieve JSON data from some API and put that data into a MariaDB table.
The JSON ships with a timestamp which I'd like to place an index on, because this attribute is used for querying the table.
The JSON looks something like this (stripped):
{
"time": "2021-12-26T14:00:00.007294Z",
"some_measure": "0.10031"
}
I create a table:
CREATE TABLE some_table (
my_json JSON NOT NULL,
time TIMESTAMP AS (JSON_VALUE(my_json , '$.time')),
some_measure DOUBLE AS (JSON_VALUE(my_json , '$.some_measure'))
)
ENGINE=InnoDB
DEFAULT CHARSET=utf8mb4
COLLATE=utf8mb4_general_ci;
my_json holds the entire JSON snippet; time and some_measure are virtual columns that extract the corresponding JSON values on the fly.
Now, trying to add an index on the TIMESTAMP attribute:
CREATE INDEX some_index ON some_table (time);
This fails:
SQL Error [1292] [22007]: (conn=454) Incorrect datetime value:
'2021-12-26T14:00:00.007294Z' for column `some_db`.`some_table`.`time` at row 1
How can I add an index on that timestamp?
The issue here is that converting a string (the JSON timestamp) to a TIMESTAMP is non-deterministic, because it depends on server-side settings (sql_mode) and the timezone configuration.
Indexing non-deterministic virtual columns is not supported.
You would want to use a VARCHAR data type instead and index that:
CREATE TABLE some_table (
my_json JSON NOT NULL,
time VARCHAR(100) AS (JSON_VALUE(my_json , '$.time')),
some_measure DOUBLE AS (JSON_VALUE(my_json , '$.some_measure'))
)
ENGINE=InnoDB
DEFAULT CHARSET=utf8mb4
COLLATE=utf8mb4_general_ci;
You should be able to create your index:
CREATE INDEX some_index ON some_table (`time`);
You can still query time, because MariaDB automatically converts a DATETIME when it is compared against a VARCHAR:
SELECT
*
FROM some_table
WHERE time > '2008-12-31 23:59:59' + INTERVAL 1 SECOND;
The query will use the index.
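The VARCHAR approach works for range queries because uniformly formatted ISO-8601 timestamps sort lexicographically in the same order as chronologically. A quick illustration (Python, used here purely as a neutral check):

```python
# Uniformly formatted ISO-8601 strings order the same way as the
# instants they denote, so a plain VARCHAR index can still answer
# chronological range queries correctly.
a = "2008-12-31T23:59:59.000000Z"
b = "2021-12-26T14:00:00.007294Z"
assert a < b             # string order matches chronological order
print(sorted([b, a]))    # sorted output is in chronological order
```

The caveat is that all stored values must share one exact format; mixing, say, values with and without fractional seconds breaks the correspondence.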
I finally came up with a solution that works for me.
Changes are:
use STR_TO_DATE() to create a valid DATETIME from the JSON timestamp
make the generated (virtual) column PERSISTENT
use data type DATETIME instead of TIMESTAMP
So the new code looks like this:
CREATE TABLE some_table (
my_json JSON NOT NULL,
time DATETIME AS (STR_TO_DATE((JSON_VALUE(my_json , '$.time')), '%Y-%m-%d%#%T%.%#%#')) PERSISTENT,
some_measure DOUBLE AS (JSON_VALUE(my_json , '$.some_measure'))
)
ENGINE=InnoDB
DEFAULT CHARSET=utf8mb4
COLLATE=utf8mb4_general_ci;
CREATE INDEX some_index ON some_table (`time`);
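For comparison, the same deterministic conversion can be expressed outside the database. A Python sketch of the STR_TO_DATE step (the strptime format string is my assumption of the equivalent pattern):

```python
from datetime import datetime

# Parse the JSON timestamp with an explicit format, mirroring the
# STR_TO_DATE() call: no server settings are consulted, so the
# conversion is deterministic.
ts = "2021-12-26T14:00:00.007294Z"
dt = datetime.strptime(ts, "%Y-%m-%dT%H:%M:%S.%fZ")
print(dt)  # -> 2021-12-26 14:00:00.007294
```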
In Delphi XE3 with FireDAC, when we create a table table0 and insert text in UTF-8 encoding, all is OK:
CREATE TABLE table0 (mytext Ntext,publishdate date);
INSERT INTO table0 (mytext,publishdate) VALUES ('привет','1998-12-07');
But when we create an FTS table and insert UTF-8 text:
CREATE VIRTUAL TABLE table1 USING FTS4(mytext,publishdate);
INSERT INTO table1 (mytext,publishdate) VALUES ('привет','1998-12-07');
and read the data with
SELECT mytext FROM table1
we get "??????".
The same commands in SQLite Expert Personal return "привет". This means that after the INSERT the table really does contain 'привет', and the SELECT returns the data in the wrong encoding.
What can be done to get the correct value from the table with FireDAC's
ADQuery.Open('SELECT mytext FROM table1')?
I think it is a FireDAC bug.
I've changed the following lines in the ADSQLiteTypeName2ADDataType procedure of the uADPhysSQLite.pas unit:
SetLen(AOptions.FormatOptions.MaxStringSize, False);
AType := dtAnsiString;
to
SetLen(AOptions.FormatOptions.MaxStringSize, True);
AType := dtWideString;
for the case ABaseTypeName = 'VARCHAR', and now
SELECT mytext FROM table0
returns the correct value at runtime. But at design time we still get '??????'.
I don't think this is a good solution, though.
I believe you should explicitly set the parameter type via AsWideString:
Query.SQL.Text := 'INSERT INTO table1 (mytext, publishdate) VALUES (:mytext, :publishdate);';
Query.Params[0].AsWideString := mytext;
Query.Params[1].AsDate := publishdate;
Query.ExecSQL;
Reference: http://docwiki.embarcadero.com/RADStudio/Rio/en/Unicode_Support_(FireDAC)#Parameter_Values
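As a sanity check that the data itself is stored correctly and the '??????' output is a client-side decoding issue, the same FTS4 round trip can be reproduced with Python's sqlite3 module (assuming an SQLite build with FTS4 enabled, which most are):

```python
import sqlite3

# Same schema and insert as the question, via a different client:
# SQLite itself round-trips the UTF-8 text without loss.
con = sqlite3.connect(":memory:")
con.execute("CREATE VIRTUAL TABLE table1 USING fts4(mytext, publishdate)")
con.execute("INSERT INTO table1 (mytext, publishdate) VALUES (?, ?)",
            ("привет", "1998-12-07"))
row = con.execute("SELECT mytext FROM table1").fetchone()
print(row[0])  # -> привет
```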
I created a table tb with the following schema:
CREATE TABLE tb (name TEXT, dt TEXT DEFAULT (date('now', 'localtime')));
Then a wrong date was inserted into tb on purpose:
INSERT INTO tb(name, dt) VALUES('somebody', '2015-99-99');
It worked without any warning, and with SELECT * FROM tb I got:
somebody|2015-99-99
The date() function shows that the input was wrong (SELECT date(dt) FROM tb returns NULL), but at insert time we did not notice the error.
Is it possible for sqlite3 to check if a timestamp input is correct or not?
(Sqlite version: 3.9.1)
To check a field, use a CHECK constraint:
CREATE TABLE tb (
name TEXT,
dt TEXT DEFAULT (date('now', 'localtime')) CHECK (date(dt) IS NOT NULL)
);
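The constraint is easy to see in action, here exercised through Python's sqlite3 module (any SQLite client would behave the same): the bogus date from the question is now rejected at insert time.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# The table from the answer: date(dt) returns NULL for invalid dates,
# so the CHECK constraint rejects them.
con.execute("""
    CREATE TABLE tb (
        name TEXT,
        dt TEXT DEFAULT (date('now', 'localtime'))
            CHECK (date(dt) IS NOT NULL)
    )
""")
con.execute("INSERT INTO tb(name, dt) VALUES ('somebody', '2015-01-31')")  # accepted
try:
    con.execute("INSERT INTO tb(name, dt) VALUES ('somebody', '2015-99-99')")
    rejected = False
except sqlite3.IntegrityError:
    rejected = True  # the bogus date violates the CHECK constraint
print(rejected)  # -> True
```

One caveat: date() normalizes some out-of-range values (e.g. '2015-02-31' becomes '2015-03-03'), so the constraint catches malformed dates but not every semantically wrong one.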
In my SQL database the type of item_id is bigint(20).
I need to extract a list of item_id values and then insert them into another table to do a join.
But R converts ITEM_ID to "double" in the result of the query:
query.1 <- c("SELECT ITEM_ID FROM DISPLAY WHERE client_key = 121")
query.1 <- paste(query.1, collapse = " ")
items <- dbGetQuery(connect.base, query.1)
typeof(items$ITEM_ID)
# [1] "double"
So I can't insert these values into the new table, as the join will not work.
There is no bigint in R, so probably I need to convert to character, but how can I do that within the RMySQL query?
Any help will be much appreciated.
You have to create the table in MySQL with the field types specified explicitly, like:
dbSendQuery(connect.base, "CREATE TABLE new_table (
  ITEM_ID bigint NOT NULL,
  KEY (ITEM_ID)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;")
Then, if you write the table from R using append=TRUE, the values will be stored in the right format:
dbWriteTable(connect.base, "new_table", items, append=TRUE, row.names=FALSE)
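The root cause is worth spelling out: R's default numeric type is an IEEE-754 double with a 53-bit mantissa, so bigint ids above 2^53 cannot all be represented exactly, and two distinct ids can collide after the round trip. The effect is easy to demonstrate (Python here, but the doubles are the same as R's):

```python
# A 64-bit id just above 2**53 silently collapses to the nearest
# representable double, which is why equality joins on such ids
# stop matching after conversion to numeric.
big_id = 2**53 + 1                    # hypothetical bigint(20) value
assert float(big_id) == float(2**53)  # the +1 is silently lost
print(int(float(big_id)))  # -> 9007199254740992
```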
Here are the queries I am using:
1. INSERT INTO sample VALUES (convert(varchar, '19-11-2014 10:10:41', 103))
2. INSERT INTO sample VALUES (convert(varchar, '2014-11-19 10:10:41', 103))
3. INSERT INTO sample VALUES (convert(varchar, '11-19-2014 10:10:41', 103))
The database format is yyyy/mm/dd HH:mm:ss:mmm.
Of these, the first query throws an error and the other two work fine. How can I insert a datetime in any format without changing the query?
Thank you.
Since you said you are using ASP.NET, why are you using plain SQL queries?
Use parameters instead.
Say you have a date in string format, 19/11/2014, that you need to insert into the DB. The correct way is to first convert the string into a DateTime:
DateTime date = DateTime.ParseExact("19/11/2014 00:00:00", "dd/MM/yyyy HH:mm:ss", System.Globalization.CultureInfo.InvariantCulture);
Now use a parametrised query to insert; see details: http://www.aspsnippets.com/Articles/ASPNet-SqlDataSource-pass-value-to-SelectParameter-using-QueryString-Parameter-Example.aspx
Using parameters serves two main purposes:
you will be safe from SQL injection attacks
the DateTime format problem you are facing goes away, since the conversion is handled by ASP.NET.
This is a very common but tricky situation: the input date may or may not be in the proper format, and a bad value only shows up as a run-time error; even on the production server we hit such errors and spend hours rectifying them.
SQL Server provides the ISDATE function, which returns 1 if the input string is a valid date, and TRY_PARSE (SQL Server 2012 and above), which returns the parsed value or NULL.
declare @dTable table (datecolumn datetime)
INSERT into @dTable values (case isdate('19-11-2014 10:10:41') when 1 then '19-11-2014 10:10:41' else null end)
INSERT into @dTable values ('2014-11-19 10:10:41')
INSERT into @dTable values ('11-19-2014 10:10:41')
select * from @dTable
If you still want to go this way (even though it is not the right way), you can create a date-conversion function that takes a string date value in any format and returns either the parsed date or NULL.
Check this link:
http://www.codeproject.com/Articles/576178/cast-convert-format-try-parse-date-and-time-sql#4
create function convertStringIntoDate
(
    @stringValue varchar(50)
)
RETURNS datetime
AS
BEGIN
    DECLARE @datereturn datetime
    set @datereturn = TRY_PARSE(@stringValue AS datetime)
    RETURN @datereturn
END
Or you can try this logic:
declare @dt varchar(50) = '19-11-2014 10:10:41'
declare @dTable table (datecolumn datetime)
INSERT into @dTable values (
    case
        when isdate(CONVERT(varchar(50), @dt)) = 1 then CONVERT(varchar(50), @dt)
        when isdate(CONVERT(varchar(50), @dt, 103)) = 1 then CONVERT(datetime, @dt, 103)
        when isdate(CONVERT(varchar(50), @dt, 102)) = 1 then CONVERT(datetime, @dt, 102)
        -- add further formats as above; if none match, fall through to null
        else
            null
    end)
select * from @dTable
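The CASE cascade above is just "try each known format until one parses". The same logic as a small sketch outside T-SQL (Python; the format list is an assumption matching the three sample values from the question):

```python
from datetime import datetime

FORMATS = [
    "%d-%m-%Y %H:%M:%S",   # 19-11-2014 10:10:41
    "%Y-%m-%d %H:%M:%S",   # 2014-11-19 10:10:41
    "%m-%d-%Y %H:%M:%S",   # 11-19-2014 10:10:41
]

def parse_any(value):
    """Return the first successful parse, or None (the 'else null' branch)."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    return None

print(parse_any("11-19-2014 10:10:41"))  # -> 2014-11-19 10:10:41
```

Like the ISDATE approach, this cannot disambiguate values such as 05-06-2014, which simply match the first format that fits; only knowing the source's format up front removes that ambiguity.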
Try this, using the CONVERT style that matches each string's layout:
INSERT INTO sample VALUES (convert(datetime, '19-11-2014 10:10:41', 105))  -- 105 = dd-mm-yyyy
INSERT INTO sample VALUES (convert(datetime, '2014-11-19 10:10:41', 120))  -- 120 = yyyy-mm-dd hh:mi:ss
INSERT INTO sample VALUES (convert(datetime, '11-19-2014 10:10:41', 110))  -- 110 = mm-dd-yyyy